Generate Model schemas from JSON and JSON Schemas in 20 languages using quicktype.io
Sometimes you just want to create Model objects around some type of JSON payload or schema definition. It seems like you either end up building your own tool, doing the conversion manually, or hunting around for a schema generator. Sometimes there just isn't a tool you want to use, and sometimes there is.
I'm working in Dart, where we want to create immutable objects from deserialized JSON for AdaptiveCards. I ran across an open-source tool, quicktype.io, that can generate JSON-serializable model classes for over 20 different languages. There are several settings that let you do some customization. You can then consume those classes/functions as-is or tune them to your needs. It is a great learning aid even if you don't use it for your models. I really like how it lets me compare how different languages handle models and JSON.
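For illustration, suppose we paste a small JSON payload like the one below into quicktype. It is a hypothetical, heavily simplified stand-in for an AdaptiveCard, not the real schema:

    {
      "type": "AdaptiveCard",
      "version": "1.5",
      "body": [
        { "type": "TextBlock", "text": "Hello from quicktype" }
      ]
    }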
Disclaimer: I have no idea who created https://quicktype.io or how it is run. If you don't trust other websites with your JSON or JSON Schema, pull down their GitHub repo, scan it for malware, and run it locally.
Pick a language
The quicktype.io language selector. JSON serialization and deserialization are supported across all languages. Immutable and functional model objects are supported in a subset of the languages. My only experience is some exploration with Dart.
Generator Options for Various Languages
Programming languages have their own requirements and capabilities. The language generators each have their own feature settings.
Video
Creating an Immutable Serializable Dart Class from JSON
The Dart generator gives us an immutable, serializable Model object.
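Here is a sketch of that output for the sample payload above; the exact code depends on the options you pick and the version of quicktype, so treat it as an approximation rather than the literal generated code:

    import 'dart:convert';

    // Immutable model for the top-level card payload.
    class AdaptiveCard {
      AdaptiveCard({
        required this.type,
        required this.version,
        required this.body,
      });

      final String type;
      final String version;
      final List<TextBlock> body;

      // Convenience helpers for working directly with JSON strings.
      factory AdaptiveCard.fromRawJson(String str) =>
          AdaptiveCard.fromJson(json.decode(str) as Map<String, dynamic>);

      String toRawJson() => json.encode(toJson());

      factory AdaptiveCard.fromJson(Map<String, dynamic> json) => AdaptiveCard(
            type: json['type'] as String,
            version: json['version'] as String,
            body: (json['body'] as List<dynamic>)
                .map((x) => TextBlock.fromJson(x as Map<String, dynamic>))
                .toList(),
          );

      Map<String, dynamic> toJson() => {
            'type': type,
            'version': version,
            'body': body.map((x) => x.toJson()).toList(),
          };
    }

    // Immutable model for a single TextBlock element in the card body.
    class TextBlock {
      TextBlock({required this.type, required this.text});

      final String type;
      final String text;

      factory TextBlock.fromJson(Map<String, dynamic> json) => TextBlock(
            type: json['type'] as String,
            text: json['text'] as String,
          );

      Map<String, dynamic> toJson() => {'type': type, 'text': text};
    }

Consuming the generated class is then a simple round trip, where rawCardJson stands in for whatever JSON string you received:

    final card = AdaptiveCard.fromRawJson(rawCardJson);
    final roundTripped = card.toRawJson();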
The generators vary in completeness, capabilities, and how well they track language standards. IMO it is a good tool for seeing how the pieces fit together even if you manually write your own classes or your own generator.
Some of the converters may be out of date with respect to the latest version of the language. You can always submit a pull request to bring them up to date. That is one of the beauties of open source.