Flavours of On-Chain SVG NFTs on Ethereum

An NFT, as a unique item on the blockchain, has a URI that points to data containing its metadata & the corresponding visuals. This URI can be an HTTP link, pointing to a video or image hosted on a normal server, or to other services like IPFS (hash-based addresses) or Arweave (incentivized hosting of hash-based content).

There is another way, however, a format that's become increasingly popular: the use of a data URI. These URIs contain all the information within themselves, so there is no server at the other end. Data URIs have allowed NFT creators to experiment with putting all the content related to an NFT 'on-chain', which adds a vector of permanence to the art: as long as Ethereum continues to exist, the art needs no ancillary infrastructure to support it. A common format, currently, is to store the NFT visuals as SVG in the data URI, since most browsers can natively parse it.

It's fun, and due to some of the constraints of smart contract coding (limited execution & expensive storage), it becomes in itself a game of gas golf, trying to pack as much as one can into the architecture to create dynamic art that will live (currently) forever on Ethereum.

SVG on-chain?

Something important to remember about how Ethereum works: you must pay to upload code (in the form of a smart contract) to Ethereum, and executing that code in a way that changes the data stored in the smart contract also costs you money (in ether). However, you can execute the code without paying when it does not produce a change in state. To change state, you issue a transaction. Merely executing the code without changing state is commonly referred to as "calling" the smart contract. Examples of the latter include fetching variables, checking whether transactions will succeed before they are issued to the network, and doing simple computations based on current state.
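To make the distinction concrete, here is a minimal sketch (the contract and function names are illustrative, not taken from any particular project):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    contract CallVsTransaction {
        uint256 public counter;

        // Changing state requires a transaction and costs gas (paid in ether).
        function increment() external {
            counter += 1;
        }

        // A 'view' function reads state without changing it, so it can be
        // executed with a call: no transaction is submitted and the caller
        // pays no gas.
        function doubled() external view returns (uint256) {
            return counter * 2;
        }
    }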

This does not mean one can expect an Ethereum node to run an arbitrarily long operation (lots of code). The computation still needs to comply with the standard rules of a transaction even though it's not submitted to the network as one: it must be executable as if it would fit into one Ethereum block. So, there is some 'gas golf' involved, where the goal is to ensure that the result can indeed be generated within the limits of one Ethereum transaction (even though that transaction is never submitted to the network).

One key consequence is the ability to generate images from existing state. What on-chain SVG projects have in common is that they generate images when 'viewed', based on code stored IN the smart contract. They are rendered by 'calling' the smart contract and asking it to execute code based on the current state. This execution does not cost money* (in terms of ether being spent) because it does not change state, and thus you can ask it to produce/render an image.

Another way to put it, is that on-chain SVG NFTs are rendered when viewed.

*caveat: it's not entirely 'free' to render, since the computation is still processed on an Ethereum node, whether that's your local machine or a hosted service like Infura. Dapp providers do pay hosted services to process these requests, but the cost is determined by the number of requests, not by the size of the computation.

My project, Neolastics, is a simple example of rendering when ‘viewed’.

1) When you mint a piece, it gets assigned a unique, random ID at the time of the transaction.
2) When you call generateSVGFromTokenID(), it uses the ID number as input to choose 9 colours. The code then compiles an SVG square of 9 tiles with these 9 colours. The same ID will always generate the same SVG output (a rough sketch of this idea follows below). More details later.
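Here is a minimal sketch of that flow (not the actual Neolastics code; the palette and hashing details are simplified assumptions):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Strings.sol";

    contract PseudoNeolastic {
        // Deterministically derive 9 colours from the token ID and lay them out
        // as a 3x3 grid of <rect> tiles. The same ID always yields the same SVG.
        function generateSVGFromTokenID(uint256 tokenId) public pure returns (string memory) {
            bytes32 seed = keccak256(abi.encodePacked(tokenId));

            // Illustrative 4-colour palette.
            string[4] memory palette;
            palette[0] = "#d40000";
            palette[1] = "#0050b0";
            palette[2] = "#f7d842";
            palette[3] = "#f0f0f0";

            string memory rects = "";
            for (uint256 i = 0; i < 9; i++) {
                // One byte of the hash picks the colour of each tile.
                string memory colour = palette[uint8(seed[i]) % 4];
                rects = string(abi.encodePacked(
                    rects,
                    '<rect x="', Strings.toString((i % 3) * 100),
                    '" y="', Strings.toString((i / 3) * 100),
                    '" width="100" height="100" fill="', colour, '"/>'
                ));
            }
            return string(abi.encodePacked(
                '<svg xmlns="http://www.w3.org/2000/svg" width="300" height="300">',
                rects,
                '</svg>'
            ));
        }
    }

Because the function is pure, rendering is a call rather than a transaction: anyone can ask for the SVG without spending ether.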

How you wrangle SVG and put it on-chain is worth exploring, as various projects have taken different approaches. The entry point to rendering the works is the tokenURI() function. In some cases, although the metadata still links to HTTP, the contract stores other functions that allow it to recreate the image & metadata. Newer projects, however, have started putting it ALL directly into tokenURI().

Avastars

Avastars was the first project to use SVG on-chain.

Let's take a look at tokenURI() with an ID (the Avastars metadata contract): https://etherscan.io/address/0x0ea3a9ffde8164bf680510e163f78150dc0274bd.

It points to an HTTP server, so one would expect it's not on-chain? Not so: at that stage, data URIs weren't readily supported by NFT viewers (even though they are permitted by the ERC721 standard). So, even though the first entry point points to an HTTP server, it is all still retrievable on-chain. In the event that the metadata server goes down, it's still all reproducible.

Metadata is available by calling getAvastarMetadata():

First off, something to remember about Etherscan: it has a bug (quirk?) where these string results are displayed in a wonky way. Commas are interpreted as new lines, so any line break here needs to be mentally replaced with a comma to read what's actually being returned from the smart contract.

The AvastarsTeleporter contract (https://etherscan.io/address/0xf3e778f839934fc819cfa1040aabacecba01e049 ) contains the render functionality. If you call renderAvastar(), you will see the SVG.

How is the SVG generated, though? How is it put together?

Avastars works by concatenating the different traits. These traits were originally injected into the contract as SVG.

An example of an SVG injection transaction, creating a trait when Avastars was deployed: https://etherscan.io/tx/0x800954149d30d98cff5d926d39504b1a184e976badc613fbc67a0226c1dc89b6
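A heavily simplified sketch of the trait-concatenation idea (the storage layout and function names here are assumptions, not Avastars' actual code):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    contract TraitStore {
        address public owner;

        // Each trait is a fragment of SVG markup, 'injected' once at setup time.
        mapping(uint256 => string) public traitSVG;

        constructor() {
            owner = msg.sender;
        }

        function setTraitSVG(uint256 traitId, string calldata svgFragment) external {
            require(msg.sender == owner, "only owner");
            traitSVG[traitId] = svgFragment;
        }

        // Rendering a token is then just concatenating the stored fragments for
        // its traits inside an <svg> wrapper.
        function renderFromTraits(uint256[] memory traitIds) public view returns (string memory) {
            string memory body = "";
            for (uint256 i = 0; i < traitIds.length; i++) {
                body = string(abi.encodePacked(body, traitSVG[traitIds[i]]));
            }
            return string(abi.encodePacked(
                '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1000 1000">',
                body,
                '</svg>'
            ));
        }
    }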

Squiggly

Squiggly followed suit, employing a similar format: an HTTP metadata server, but with on-chain code to reproduce the work if need be. Let's call tokenURI() to see what comes up: https://etherscan.io/address/0x36f379400de6c6bcdf4408b282f8b685c56adc60

But. Squiggly has a getIdtoSVG() function in case this server goes down.

But how is the eventual image generated? In Squiggly, all of the SVG was uploaded into the smart contract upon deployment (vs being injected as traits, as with Avastars).

From the seed, it initially sets up the gradient and creates the curves you see in the final image.
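As a rough illustration of that idea (not Squiggly's actual code; the gradient and curve maths here are made up), a seed can drive both a gradient definition and a curve path:

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Strings.sol";

    contract SeededSquiggle {
        // Derive a hue and the control point of a single quadratic curve from
        // the seed (purely illustrative).
        function render(uint256 seed) public pure returns (string memory) {
            uint256 hue = seed % 360;
            uint256 midY = 50 + (seed % 200);
            return string(abi.encodePacked(
                '<svg xmlns="http://www.w3.org/2000/svg" width="300" height="300">',
                '<defs><linearGradient id="g">',
                '<stop offset="0%" stop-color="hsl(', Strings.toString(hue), ',90%,50%)"/>',
                '<stop offset="100%" stop-color="hsl(', Strings.toString((hue + 120) % 360), ',90%,50%)"/>',
                '</linearGradient></defs>',
                '<path d="M 10 150 Q 150 ', Strings.toString(midY), ' 290 150" stroke="url(#g)" stroke-width="12" fill="none"/>',
                '</svg>'
            ));
        }
    }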

Squiggly.wtf, unlike Avastars, does not have any metadata on-chain.

Bonus points: if you follow the metadata on Squiggly.wtf, you'll notice it has an "image_data" field. That is not part of the ERC721 standard. It was introduced by OpenSea as a work-around, so that the field is parsed not as an HTTP URI but as image data. It's not needed anymore, as most NFT marketplaces (like OpenSea) now support data URIs directly. So, instead of "image_data", today you can just use "image" in the metadata.
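For illustration, metadata with a direct "image" data URI could be assembled on-chain roughly like this (a sketch of the pattern, not any particular project's code):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    contract InlineMetadata {
        // Wrap an already-generated SVG string in ERC721 metadata JSON, placing
        // the SVG straight into the standard "image" field as a data URI.
        // Note: returning it unencoded like this only works if the JSON & SVG
        // avoid characters that break a URI; otherwise base64-encode (see later).
        function buildTokenURI(string memory name, string memory svg) public pure returns (string memory) {
            return string(abi.encodePacked(
                'data:application/json,{"name":"', name,
                '","description":"An on-chain piece",',
                '"image":"data:image/svg+xml,', svg, '"}'
            ));
        }
    }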

Neolastics

Neolastics follows a similar architecture to Squiggly.wtf: an off-chain metadata server, with the artwork reproducible on-chain, and no on-chain metadata.

Let's call tokenURI() on the contract with an ID: https://etherscan.io/address/0xb2d6fb1dc231f97f8cc89467b52f7c4f78484044

Same: it uses an off-chain server. However, generateSVGofTokenById() always allows us to recreate the image directly as SVG, even if the metadata server were to go offline in the future.

How are the pieces put together? It's fairly similar to Squiggly in that the pieces are combined and changed from a seed in the render function itself. The SVG used to create the pieces was uploaded into the smart contract upon deployment.

TinyBoxes

TinyBoxes follows a similar architecture.

Let’s call tokenURI(): https://etherscan.io/address/0x46f9a4522666d2476a5f5cd51ea3e0b5800e7f98

Same as before: it has a custom rendering function, tokenArt(), that keeps the image on-chain.

Notably, it has a non-standard addition: although the ERC721 metadata is not on-chain, the image's own traits/metadata are stored as HTML tags in the image itself.

How does it put its SVG together? It’s a bit more complicated. It still builds the shapes and animations from a random seed, but it has a more complex way to combine the SVG elements themselves.

There’s an SVG.sol that abstracts out some of the complexity of doing string wrangling in the render function itself.

Same with adding animation elements: https://github.com/skyfly200/tiny-boxes/blob/master/contracts/libraries/Animation.sol

As you can see, adding SVG into Solidity isn't exactly neat & clean, so these abstractions do help.
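As a toy illustration of what such an abstraction can look like (this is not the actual SVG.sol from TinyBoxes, just the general pattern):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Strings.sol";

    // A toy helper library in the spirit of TinyBoxes' SVG.sol: hide the string
    // wrangling behind small functions so the render code stays readable.
    library SVG {
        function rect(uint256 x, uint256 y, uint256 w, uint256 h, string memory fill)
            internal pure returns (string memory)
        {
            return string(abi.encodePacked(
                '<rect x="', Strings.toString(x),
                '" y="', Strings.toString(y),
                '" width="', Strings.toString(w),
                '" height="', Strings.toString(h),
                '" fill="', fill, '"/>'
            ));
        }

        function wrap(string memory body) internal pure returns (string memory) {
            return string(abi.encodePacked(
                '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">', body, '</svg>'
            ));
        }
    }

    contract UsesSVG {
        function example() external pure returns (string memory) {
            return SVG.wrap(SVG.rect(10, 10, 80, 80, "#222222"));
        }
    }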

Mandalas

Mandalas follows suit, but is the first to do something new and interesting.

Let’s call tokenURI() with an ID: https://etherscan.io/address/0xDaCa87395f3b1Bbc46F3FA187e996E03a5dCc985.

Instead of pointing to a metadata server, you actually get the full JSON string + image! No HTTP server required. At that point, data URIs weren't broadly supported, but Mandalas pushed ahead and pioneered it anyway.

What's going on here? First off: remember that Etherscan has a bug/quirk where commas are interpreted as new lines. So, the full data URI returned has a comma after xml on the first line, and after base64 on the second line. If you copy the entire image data URI (adding the commas back in) and paste it into your browser, the corresponding Mandala will appear!

Something new here is the base64 encoding (for the image data embedded inside the SVG). A URI can only safely contain certain characters, so to render the embedded gif it has to be encoded into base64. The SVG itself, however, avoids special characters, so it wasn't necessary to encode the whole SVG into base64. It is quite easy to accidentally add in unsafe characters, though: using # for CSS styling, for example, trips up the URI. Encoding to base64 does have a trade-off, however: it costs more gas.
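To make the trade-off concrete, here is a small sketch (not Mandalas' code; it assumes a Base64 helper such as OpenZeppelin's Base64.sol):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Base64.sol";

    contract DataURIs {
        function exampleSVG() internal pure returns (string memory) {
            return '<svg xmlns="http://www.w3.org/2000/svg"><rect width="10" height="10" fill="red"/></svg>';
        }

        // Cheaper: put the SVG into the data URI as-is. This is only safe if the
        // markup avoids characters that break a URI (e.g. '#' in fill="#ff0000"
        // or url(#id)).
        function plainURI() external pure returns (string memory) {
            return string(abi.encodePacked("data:image/svg+xml,", exampleSVG()));
        }

        // Safer, but costs more gas: base64-encode the SVG so any character is fine.
        function base64URI() external pure returns (string memory) {
            return string(abi.encodePacked(
                "data:image/svg+xml;base64,",
                Base64.encode(bytes(exampleSVG()))
            ));
        }
    }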

How does it put its SVG together?

Mandalas uses a unique rendering system where it paints over a templated Mandala with new base64 encoded pixels.

UniSwap V3 NFTs

It also animates! Click through! https://opensea.io/assets/0xc36442b4a4522e871399cd717abdd847ab11fe88/102626

In UniSwap V3, liquidity positions are unique and are traded as NFTs. It was thus a great opportunity to create cool art for these positions. Given the size and influence of the UniSwap team, they helped lobby platforms to fully support data URIs across the board (as should have been the case from the beginning of the ERC721 standard).

Let’s call tokenURI(): https://etherscan.io/address/0xc36442b4a4522e871399cd717abdd847ab11fe88

Screenshot 2021-08-25 at 11.15.40.png

It's all base64! The safest encoding. If you decode it, you'll get the JSON metadata. You'll then find that the "image" field is SVG that's also encoded into base64. Decode that again, and you get the sweet SVG that turns into the image linked above! Great!
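Roughly, the nesting can be reproduced like this (a simplified sketch of the pattern, not the actual UniSwap code; it again assumes a Base64 helper):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Base64.sol";

    contract NestedBase64 {
        // Layer 1: base64-encode the SVG and put it in the "image" field.
        // Layer 2: base64-encode the whole JSON blob and wrap it in a data URI.
        function buildTokenURI(string memory name, string memory svg) public pure returns (string memory) {
            string memory image = string(abi.encodePacked(
                "data:image/svg+xml;base64,", Base64.encode(bytes(svg))
            ));
            string memory json = string(abi.encodePacked(
                '{"name":"', name,
                '","description":"Rendered when viewed","image":"', image, '"}'
            ));
            return string(abi.encodePacked(
                "data:application/json;base64,", Base64.encode(bytes(json))
            ));
        }
    }

Anything that decodes the outer layer gets valid JSON; decoding the image field again yields the SVG.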

How does it put its SVG together?

This is quite complicated, since it has many moving parts, some of them generated from the underlying financial position. It offloads the generation to separate library contracts: NFTDescriptor.sol & NFTSVG.sol, again abstracting some of the complexity away into various functions.

https://github.com/Uniswap/uniswap-v3-periphery/blob/main/contracts/libraries/NFTDescriptor.sol#L44

As you can see again, Solidity wasn't really made to wrangle all this SVG. It can get quite hairy when a project balloons in complexity. 😅

From: https://github.com/Uniswap/uniswap-v3-periphery/blob/main/contracts/libraries/NFTSVG.sol

Still. It works! And it’s pretty. :)

Anchor Certificates

Anchor Certificates, another of my projects (for Untitled Frontier), follows suit and puts everything on-chain.

Let’s call tokenURI(): https://etherscan.io/address/0x600a4446094c341693c415e6743567b9bfc8a4a8

Same as UniSwap. All on-chain.

How does it put the SVG together? This is simpler. Some components are extracted into separate functions. One reason for that is that Solidity can't keep too many local variables on the stack at once (the 'stack too deep' error), so splitting the work into functions lets it shed those variables as each function returns, before continuing with the computation.
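A contrived sketch of that pattern (the split points and shapes are arbitrary; the point is just that each helper keeps its own, smaller set of local variables):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    contract SplitRender {
        // Each helper builds one chunk of the SVG with its own handful of locals.
        // Once a helper returns, its locals are gone, which helps avoid
        // Solidity's "stack too deep" error in one giant render function.
        function generateSVG(uint256 tokenId) public pure returns (string memory) {
            return string(abi.encodePacked(
                '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 400 400">',
                background(tokenId),
                border(),
                '</svg>'
            ));
        }

        function background(uint256 tokenId) internal pure returns (string memory) {
            string memory shade = "#eeeeee";
            if (tokenId % 2 == 1) {
                shade = "#dddddd";
            }
            return string(abi.encodePacked('<rect width="400" height="400" fill="', shade, '"/>'));
        }

        function border() internal pure returns (string memory) {
            return '<rect x="10" y="10" width="380" height="380" fill="none" stroke="#000000"/>';
        }
    }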

Blitmap

https://opensea.io/assets/0x8d04a8c79ceb0889bdd12acdf3fa9d207ed3ff63/484

Blitmap follows a more traditional architecture, with both image + metadata being off-chain.

Let’s call tokenURI(): https://etherscan.io/address/0x8d04a8c79ceb0889bdd12acdf3fa9d207ed3ff63

But, same as with other projects like Neolastics & Avastars, you can generate the image from a custom function.

It's all rects? Yes. What's interesting about Blitmap is that it essentially creates these small images from rect 'pixels', generating many rects to get to the final image. The image is injected onto the chain as 268 bytes of pixel data (through mintOriginal() or mintVariant()) and then recreated from this data.
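A sketch of the 'pixels as rects' idea (deliberately simplified: an 8x8 image with one palette index per byte, rather than Blitmap's actual 268-byte packing):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Strings.sol";

    contract PixelRects {
        // Turn a blob of pixel data into one <rect> per pixel. Assumes an 8x8
        // image where each byte of `pixels` is an index into a 4-colour palette.
        function pixelsToSVG(bytes memory pixels, string[4] memory palette)
            public pure returns (string memory)
        {
            string memory body = "";
            for (uint256 i = 0; i < 64 && i < pixels.length; i++) {
                body = string(abi.encodePacked(
                    body,
                    '<rect x="', Strings.toString(i % 8),
                    '" y="', Strings.toString(i / 8),
                    '" width="1" height="1" fill="', palette[uint8(pixels[i]) % 4], '"/>'
                ));
            }
            return string(abi.encodePacked(
                '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 8 8" shape-rendering="crispEdges">',
                body,
                '</svg>'
            ));
        }
    }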

Nouns

Nouns also put all metadata + SVG on-chain.

Let’s call tokenURI(): https://etherscan.io/address/0x9c8ff314c9bc7f6e59a9d9225fb22946427edc03

Nouns seems to follow a combination of UniSwap V3 NFTs & Blitmap: it has a separate descriptor contract (NounsDescriptor.sol) and uses similar function naming, but it follows Blitmap in having the artwork encoded into parts that are then combined into a blob of SVG rectangles. While the idea of the encoding is similar, Nouns has a technique where, instead of one rect per 'pixel', runs of similarly coloured pixels are grouped into one SVG rectangle. This saves on the size of the SVG output.

Similar to Avastars, the traits were added to the descriptor contract. This is different from Blitmap, where the entire image was submitted.
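As a sketch of that grouping idea (not the actual Nouns encoder, which works on its own compressed format), here is how one row of palette indices could be collapsed into runs of rects:

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import "@openzeppelin/contracts/utils/Strings.sol";

    contract RunLengthRects {
        // Instead of one <rect> per pixel, emit one <rect> per run of identical
        // colours in a row. `row` holds palette indices; `y` is the row being drawn.
        function rowToRects(uint8[] memory row, uint256 y, string[4] memory palette)
            public pure returns (string memory)
        {
            string memory body = "";
            uint256 start = 0;
            for (uint256 i = 1; i <= row.length; i++) {
                // Close the current run when the colour changes or the row ends.
                if (i == row.length || row[i] != row[start]) {
                    body = string(abi.encodePacked(
                        body,
                        '<rect x="', Strings.toString(start),
                        '" y="', Strings.toString(y),
                        '" width="', Strings.toString(i - start),
                        '" height="1" fill="', palette[row[start] % 4], '"/>'
                    ));
                    start = i;
                }
            }
            return body;
        }
    }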

solSeedlings

Calling tokenURI() on solSeedlings returns a mixture. It isn't entirely to standard, since it contains special characters that aren't allowed in a data URI. But, as we've seen with many variations of on-chain artworks, as long as some of it is on-chain, it's more readily reproducible.

How are the SVGs generated in solSeedlings?

The rendering in solSeedlings is all custom, depending on the collection, and it also uses a seed for its randomness.

What’s next?

There are other projects that use SVG on-chain, so this isn't an exhaustive list; I just wanted to showcase a range of approaches. I hope it helps you understand the different ways projects are putting metadata & SVG on-chain, as well as the unique ways they render and encode the images.

This is also just a subset of the on-chain artwork projects out there. In some instances, like 0xmon, the image is stored as calldata and then retrieved as a gif. Another example is brotchain, which encodes its art as bitmaps. ArtBlocks, the most popular generative art marketplace, stores the scripts that render the artworks on-chain, but those scripts require additional off-chain libraries (such as p5.js or three.js).

While you can argue ad infinitum about whether something is fully on-chain or not, there are at least very interesting experiments happening with placing art closer to the bare bones of Ethereum. SVG is currently a popular format due to its native support in browsers, but who knows what else will be produced in the future? I'm sure we'll see more standards developed (eg, Nouns using a custom encoding system). For now, it's a burgeoning scene with beautiful art being produced.

Hope you join in on the fun!

PS. If I got any details incorrect, do let me know so I can fix it.
