DesLauriers posted his code on GitHub; it requires a local installation of Stable Diffusion and Node.js. At this point, it’s an advanced prototype that takes some technical skill to set up, but it’s also a noteworthy example of the unexpected graphical innovations that can come from open-sourcing powerful image synthesis models. Stable Diffusion, which became open source on August 22, creates images using a neural network trained on tens of millions of images retrieved from the Internet. Its ability to draw from a wide range of visual influences translates well to extracting color palette information.

Other examples of palettes provided by DesLauriers include Tokyo Neon, which offers the colors of a vibrant Japanese cityscape; Vivid Coral, which echoes a coral reef with deep pinks and blues; and Green Garden, Blue Sky, which offers a rich pastoral scene. DesLauriers took to Twitter earlier today to demonstrate how different quantization techniques (reducing the vast number of colors in an image to a few that represent the image) can produce different color palettes.
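To give a sense of what color quantization involves, here is a minimal, self-contained sketch in pure Python. It is not DesLauriers's actual code; it uses simple k-means clustering (one common quantization technique) on a synthetic list of RGB pixels, and all names in it are illustrative.

```python
def quantize(pixels, k=3, iters=10):
    """Reduce a list of (r, g, b) pixels to k representative palette colors
    using basic k-means clustering, a common color-quantization technique."""
    # Spread initial centers evenly through the pixel list (deterministic)
    step = max(1, len(pixels) // k)
    centers = [pixels[i * step] for i in range(k)]
    for _ in range(iters):
        # Assign each pixel to its nearest center (squared RGB distance)
        clusters = [[] for _ in range(k)]
        for p in pixels:
            nearest = min(
                range(k),
                key=lambda i: sum((p[c] - centers[i][c]) ** 2 for c in range(3)),
            )
            clusters[nearest].append(p)
        # Move each center to the mean color of its cluster
        centers = [
            tuple(sum(p[c] for p in cl) // len(cl) for c in range(3)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers

# Synthetic "image": mostly-red, mostly-green, and mostly-blue pixels with slight variation
pixels = (
    [(200 + i % 20, 30, 30) for i in range(50)]
    + [(30, 180 + i % 20, 30) for i in range(50)]
    + [(30, 30, 220 + i % 20) for i in range(50)]
)
palette = quantize(pixels, k=3)
print(palette)  # three colors, one dominated by each channel
```

Swapping in a different clustering method, or a different color space than RGB, yields different palettes from the same image, which is the effect DesLauriers demonstrated.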