If you like this content consider tipping in ETH, LINK or an ERC20 stablecoin at this address: 0xf9f6849230cBe10200B43dB103511b778898e71C
As explained in my post here, with Ethereum’s merge on the horizon it is time to explore its potential for environmental applications. BrightLink is a proof-of-concept system that incentivizes hypothetical communities to “brighten” their local environment as a local response to climate change. This is achieved using a Solidity contract on Ethereum that gets remote sensing data via a Chainlink oracle. The data provided to that oracle will come from an app that analyses drone or satellite remote sensing data. I am just starting out on my development journey with this project and will be documenting it here. This introductory post sets the scene – many of the details are sure to change as the project develops.
1) Environmental organizations are incentivized to “brighten” snow, ice and sea ice surfaces to slow their rate of melting. This is already done occasionally using light-scattering sand or strategically placed white sheets, and it has the effect of slowing sea ice retreat and prolonging snowpacks for ski resorts, etc. The degree of brightening will be quantified as the deviation of the surface albedo, derived from Sentinel-2 imagery, from a baseline value.
2) A community is incentivized to “green” their local environment by conserving and adding vegetated land (replanting verges, rooftop gardens, etc.). The payout is scaled by the % change in “green” area within the area of interest. Surface greening is calculated using a supervised classification algorithm applied to multispectral Sentinel-2 satellite data.
3) Other applications could include incentivizing beach clean-ups by remotely quantifying beach litter using drone data, etc.
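To make the payout scaling in use case 2 concrete, here is a minimal sketch of how escrowed funds might be released in proportion to the % change in green area. The function name, the 10% target and the linear scaling are all my own illustrative assumptions, not details from the BrightLink contract.

```python
# Hypothetical payout scaling for the "greening" use case.
# All names and the 10% target are illustrative assumptions.

def greening_payout(escrow_dai: float,
                    baseline_green_m2: float,
                    latest_green_m2: float,
                    max_change_pct: float = 10.0) -> float:
    """Scale the escrowed payout by the % change in vegetated area.

    A change at or above `max_change_pct` releases the full escrow;
    no improvement (or a decline) releases nothing.
    """
    change_pct = 100.0 * (latest_green_m2 - baseline_green_m2) / baseline_green_m2
    fraction = min(max(change_pct, 0.0), max_change_pct) / max_change_pct
    return escrow_dai * fraction
```

For example, a 5% increase in green area against a 10% target would release half of the escrowed DAI.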
The initial capital comes from a donor individual or organization who wishes to incentivize a second organization or individual to brighten a specific area of interest. This initial donation is sent into escrow in the BrightLink Solidity contract in DAI.
The contract then deposits those funds into a DeFi strategy (an Aave lending pool) where they accrue interest. The interest is profit for the contract owner, generating income without decrementing the donated capital. When the consumer organization triggers a settlement, or some predetermined time has elapsed, the funds are pulled from the Aave pool: the accrued interest is released to the contract owner as profit, while the initial capital remains in escrow in the contract, ready to be paid out at a rate that depends on the degree to which the consumer has achieved its goals.
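The accounting at settlement is simple enough to sketch: whatever the pool balance has grown to, the principal stays in escrow and only the excess goes to the owner. This is a toy model of the flow described above, with illustrative names and numbers, not the contract's actual logic.

```python
# Toy accounting for the settlement step: the donated principal stays in
# escrow while interest accrued in the lending pool goes to the contract
# owner. Names and numbers are illustrative.

def settle_pool(principal: float, pool_balance: float) -> dict:
    """Split a withdrawn pool balance into owner profit and escrowed capital."""
    interest = max(pool_balance - principal, 0.0)  # guard against a pool loss
    return {"owner_profit": interest, "escrowed_capital": principal}

result = settle_pool(principal=10_000.0, pool_balance=10_250.0)
# the owner receives the interest; the full 10,000 DAI stays in escrow
```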
The contract contains a function that triggers a Chainlink oracle to make a GET request to an API endpoint. That endpoint serves a JSON file containing the results of a satellite remote sensing script run externally; each time the script finishes executing, it updates the JSON file with the most recent data. On request, the Chainlink oracle ingests that data into the smart contract. This happens twice: first to establish a baseline, which becomes the target value for the consumer organization to beat, and then, after some predetermined time has passed, to determine the amount of DAI that should be withdrawn from the Aave pool and paid out.
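On the data side, the external script only needs to keep a small JSON file up to date for the oracle to fetch. A minimal sketch of what that writer might look like is below; the field names and file path are assumptions, not a defined BrightLink schema.

```python
# Minimal sketch of the JSON payload the external remote sensing script
# might write for the oracle's GET request. Field names are assumptions.

import json
import time

def write_result(path: str, area_mean_albedo: float, green_area_m2: float) -> None:
    """Overwrite the endpoint file with the latest remote sensing metrics."""
    payload = {
        "timestamp": int(time.time()),          # when the script last ran
        "area_mean_albedo": area_mean_albedo,   # brightening metric
        "green_area_m2": green_area_m2,         # greening metric
    }
    with open(path, "w") as f:
        json.dump(payload, f)

write_result("latest.json", 0.62, 12_500.0)
```

The same file is fetched once to set the baseline and again at settlement, so the contract never needs to know how the metrics were produced.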
Precise details of the remote sensing app are TBC, but I intend to build a Sentinel-2 supervised classification scheme and a chlorophyll-index calculation to determine surface greening, and a narrowband-to-broadband albedo conversion to determine surface brightening. The result will be a spatial statistic (e.g. area-mean albedo, total chlorophyll) compared to a baseline value.
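As a rough illustration of the two planned metrics: a broadband albedo estimate is a weighted sum over narrowband reflectances, and a vegetation index like NDVI uses the red and near-infrared bands. The weights below are placeholders only, not a published coefficient set; a real implementation would use calibrated Sentinel-2 conversion coefficients.

```python
# Sketch of the two planned per-pixel metrics from Sentinel-2 surface
# reflectance. The narrowband-to-broadband weights are PLACEHOLDERS,
# not a published coefficient set.

def broadband_albedo(b2: float, b4: float, b8: float,
                     b11: float, b12: float) -> float:
    """Weighted narrowband-to-broadband conversion (placeholder weights)."""
    return 0.25 * b2 + 0.25 * b4 + 0.25 * b8 + 0.15 * b11 + 0.10 * b12

def ndvi(red: float, nir: float) -> float:
    """Normalized difference vegetation index from red (B4) and NIR (B8)."""
    return (nir - red) / (nir + red)

# Area-mean statistic over a toy two-pixel scene, to be compared to a baseline
red = [0.10, 0.12]
nir = [0.40, 0.50]
pixel_ndvi = [ndvi(r, n) for r, n in zip(red, nir)]
mean_ndvi = sum(pixel_ndvi) / len(pixel_ndvi)
```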
This model system will be fully decentralized except for the fact that it ingests data from a centralized source: my remote sensing app. This gives me too much power over the outcome of the contract and undermines the decentralized nature of the rest of the project. That is acceptable while this is a proof-of-concept project, but if it were ever deployed to mainnet it would be a conceptual hurdle to overcome. Some options I have considered include:
a) building an aggregator that derives the critical data from multiple sources (e.g. Landsat, Sentinel, Planet), and keeping the processing scripts as simple and transparent as possible so that the community can vet them easily
b) building the remote sensing scripts on Google Earth Engine and mirroring them on Microsoft’s Planetary Computer, as well as making a dockerized version available so they can be run by anyone and there is redundancy if a specific platform goes down. Results from public runs of the script could be used to verify those of the “master” script that feeds the oracle with data
c) using a DAO to vote on valid data sources via a governance token
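Option (a) amounts to feeding the oracle a robust statistic across independent providers rather than one script's output. A sketch of that aggregation, with illustrative source names and values:

```python
# Sketch of option (a): aggregate the critical metric from several
# independent remote sensing sources and feed the oracle a robust
# statistic. Source names and values are illustrative.

from statistics import median

def aggregate_albedo(source_values: dict) -> float:
    """Take the median across providers so no single source controls the result."""
    return median(source_values.values())

estimates = {"landsat": 0.61, "sentinel2": 0.63, "planet": 0.62}
agg = aggregate_albedo(estimates)
```

The median is a natural choice here because a single manipulated or faulty source cannot move it, whereas it would drag a mean.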
I will not do much about this for now beyond thinking about it and seeing whether a sensible solution emerges as the project develops.