<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>carboncast | UCSC OSPO</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/tag/carboncast/</link><atom:link href="https://deploy-preview-1007--ucsc-ospo.netlify.app/tag/carboncast/index.xml" rel="self" type="application/rss+xml"/><description>carboncast</description><generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><lastBuildDate>Mon, 15 Sep 2025 00:00:00 +0000</lastBuildDate><image><url>https://deploy-preview-1007--ucsc-ospo.netlify.app/media/logo_hub6795c39d7c5d58c9535d13299c9651f_74810_300x300_fit_lanczos_3.png</url><title>carboncast</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/tag/carboncast/</link></image><item><title>Final Report: CarbonCast — An end-to-end consumption-based Carbon Intensity Forecasting service</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250915-tanushsavadi/</link><pubDate>Mon, 15 Sep 2025 00:00:00 +0000</pubDate><guid>https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250915-tanushsavadi/</guid><description>&lt;p>Hi everyone—this is my final report for &lt;strong>CarbonCast&lt;/strong>, mentored by &lt;strong>Professor Abel Souza&lt;/strong>. Back in June, my goal was simple to say and harder to pull off: help people &lt;strong>see&lt;/strong> when the grid is cleaner and make it easy to act on that information. Over the summer I turned CarbonCast from a research prototype into something you can open, click, and rely on: a containerized backend, a clean API, and a fast, friendly map UI.&lt;/p>
&lt;h2 id="background">Background&lt;/h2>
&lt;p>CarbonCast forecasts the &lt;strong>carbon intensity&lt;/strong> of electricity (gCO₂e/kWh) using grid data and weather. Earlier versions were accurate but difficult to run and even harder to use outside a research context. My OSRE focus was to make CarbonCast usable for real people: provide a standard API, build a web UI that feels responsive, and package everything so it starts quickly and keeps itself healthy.&lt;/p>
&lt;h2 id="goals">Goals&lt;/h2>
&lt;p>I centered the work around four goals. First, I wanted to &lt;strong>ship an end-to-end containerized stack&lt;/strong>—data collection, validation, storage, API, and UI—that someone else could run without digging through my notes. Second, I aimed to &lt;strong>expand coverage&lt;/strong> beyond a handful of regions so the map would be genuinely useful. Third, I needed to &lt;strong>make it reliable&lt;/strong>, with retries, monitoring, and graceful fallbacks so the system could run for weeks without babysitting. Finally, I wanted to &lt;strong>lay the groundwork for a consumption-based signal&lt;/strong>, because imports from neighboring regions also shape a region’s true emissions picture.&lt;/p>
&lt;h2 id="what-i-built">What I built&lt;/h2>
&lt;p>By the end of the program, CarbonCast runs as a &lt;strong>containerized backend + API + web app&lt;/strong> that you can bring up with Docker. The pipelines now reach &lt;strong>85+ regions&lt;/strong>, and the UI currently exposes &lt;strong>58+&lt;/strong> while we finish integrating the rest. The API offers straightforward endpoints for current conditions and multi-day views, plus region metadata so clients can discover what’s available. The UI presents an &lt;strong>interactive choropleth map&lt;/strong> with a side panel for the &lt;strong>energy mix&lt;/strong> and a simple &lt;strong>timeline&lt;/strong> to move between past, now, and the next few days. To keep things feeling snappy, I tuned caching so “now” data updates quickly while historical and forecast views load instantly from cache. I also added a small &lt;strong>“mission control” dashboard&lt;/strong> that shows what updated, what failed, and how the system recovered, which makes maintenance far less mysterious.&lt;/p>
&lt;h2 id="how-it-works">How it works&lt;/h2>
&lt;p>Fresh weather and grid data arrive on a regular schedule. The system checks each file for sanity, stores it, and serves it through a clean API. The React app calls that API and paints the map. Hovering reveals regional details; clicking opens a richer panel with the energy mix and trends; the timeline lets you scrub through hours naturally. In short, the path is &lt;strong>fresh data → API → map&lt;/strong>, and each step is designed to be obvious and quick.&lt;/p>
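&lt;p>The “checks each file for sanity” step can be pictured as a small gatekeeper function. A minimal Python sketch; the field names and bounds are illustrative, not CarbonCast’s actual schema:&lt;/p>

```python
# Illustrative sanity check for an incoming carbon-intensity record.
# Field names and bounds are hypothetical, not CarbonCast's real schema.

def is_sane(record: dict) -> bool:
    """Reject records with missing fields or physically implausible values."""
    required = ("region", "timestamp", "carbon_intensity")
    if any(key not in record for key in required):
        return False
    ci = record["carbon_intensity"]
    # gCO2e/kWh for real grids sits roughly between 0 (all renewables)
    # and ~1000 (coal-heavy); anything far outside is treated as bad data.
    return isinstance(ci, (int, float)) and 0 <= ci <= 1500

good = {"region": "CISO", "timestamp": "2025-09-15T00:00Z", "carbon_intensity": 210.5}
bad = {"region": "CISO", "timestamp": "2025-09-15T00:00Z", "carbon_intensity": -40}
```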
&lt;p>Behind the scenes, I extended the existing Django backend with a &lt;strong>SQLite path&lt;/strong> so the UI works out of the box on a laptop. For production, you can point the same code at Postgres or MySQL without changing the UI. This choice made local testing easy while leaving room for scale later.&lt;/p>
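&lt;p>A sketch of what that dual database path can look like in Django settings; the environment variable names and defaults here are assumptions, not the project’s actual configuration:&lt;/p>

```python
# Illustrative Django DATABASES setting: SQLite out of the box, Postgres
# when host details are provided. Variable names are hypothetical.
import os

if os.environ.get("CARBONCAST_DB_HOST"):
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": os.environ.get("CARBONCAST_DB_NAME", "carboncast"),
            "HOST": os.environ["CARBONCAST_DB_HOST"],
        }
    }
else:
    # Laptop path: a local SQLite file, no setup required.
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.sqlite3",
            "NAME": "carboncast.sqlite3",
        }
    }
```

&lt;p>Because the UI only talks to the API, nothing above the settings layer changes when you swap backends.&lt;/p>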
&lt;h2 id="highlights">Highlights&lt;/h2>
&lt;p>A few moments stand out. The first time the dashboard flipped from red to green on its own—after the system retried through a wave of timeouts—was a turning point. Clicking across the map and getting instant responses because the right data was cached felt great too. And packaging everything so another person can run it without asking me for help might be the biggest quality-of-life win for future contributors.&lt;/p>
&lt;h2 id="challenges">Challenges&lt;/h2>
&lt;p>The first big hurdle was &lt;strong>refactoring the old vanilla-JS interface&lt;/strong>. The original UI worked, but it was dated and hard to extend. I rebuilt it as a modern React + TypeScript app with a cleaner component structure and a fresh look—think &lt;strong>glassmorphic panels&lt;/strong>, readable color scales, and a layout that feels consistent on both laptops and smaller screens. Moving to this design system made the codebase far easier to maintain, theme, and iterate on.&lt;/p>
&lt;p>The next challenge was &lt;strong>performance under real-time load&lt;/strong>. With dozens of regions updating, it was easy to hit API limits and make the UI feel jittery. I solved this by adding a smart &lt;strong>caching layer&lt;/strong> with short, volatility-aware timeouts, request de-duplication, and background prefetching. That combination dramatically reduced round-trips, essentially &lt;strong>eliminated rate-limit hits&lt;/strong>, and made the map feel responsive even as you scrub through time. The result is a UI that can handle many simultaneous updates &lt;strong>without hiccups&lt;/strong>.&lt;/p>
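&lt;p>The caching idea is easiest to show outside React. Here is a minimal Python sketch of a volatility-aware TTL cache with request de-duplication; all names and TTL values are illustrative, not the UI’s actual code:&lt;/p>

```python
import time

class TTLCache:
    """Cache whose entries expire quickly for volatile data ("now") and
    slowly for stable data (history, forecasts). A sketch of the idea."""

    def __init__(self):
        self._store = {}          # key -> (expires_at, value)
        self._in_flight = set()   # keys currently being fetched (de-dup)

    def get(self, key, fetch, ttl_seconds):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit and hit[0] > now:
            return hit[1]                      # fresh hit: no round-trip
        if key in self._in_flight:
            return hit[1] if hit else None     # another fetch is running: reuse stale
        self._in_flight.add(key)
        try:
            value = fetch()
            self._store[key] = (now + ttl_seconds, value)
            return value
        finally:
            self._in_flight.discard(key)

cache = TTLCache()
live = cache.get("CISO/now", lambda: 210.5, ttl_seconds=60)            # short TTL
hist = cache.get("CISO/2025-09-14", lambda: 198.0, ttl_seconds=86400)  # long TTL
```

&lt;p>With per-key TTLs, “now” stays fresh while scrubbing through history never re-fetches, which is exactly the behavior described above.&lt;/p>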
&lt;p>Finally, there were plenty of &lt;strong>stubborn UI bugs&lt;/strong>. Some regions wouldn’t color even when data was available, certain charts refused to render, and a few elements flickered or never showed up. Most of this came down to learning &lt;strong>React state management&lt;/strong> in a real project: taming race conditions, canceling in-flight requests when users navigate, and making sure state only updates when fresh data actually arrives. Fixing those issues taught me a lot about how maps re-paint, how charts expect their data, and how to keep components simple enough that they behave the way users expect.&lt;/p>
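&lt;p>The core fix for those races is a “latest request wins” guard: state only updates if the arriving response belongs to the most recent request. In the React app this is done with effect cleanup and request cancellation; the same logic, sketched in Python with hypothetical names:&lt;/p>

```python
# Language-agnostic sketch of the "latest request wins" pattern that
# prevents a slow, stale response from overwriting newer UI state.

class LatestOnly:
    def __init__(self):
        self._latest = 0
        self.state = None

    def begin(self) -> int:
        self._latest += 1
        return self._latest          # ticket identifying this request

    def complete(self, ticket: int, data) -> bool:
        if ticket != self._latest:   # a newer request started meanwhile
            return False             # drop the stale response
        self.state = data
        return True

panel = LatestOnly()
t1 = panel.begin()            # user clicks region A
t2 = panel.begin()            # user quickly clicks region B
panel.complete(t1, "A data")  # arrives late: ignored
panel.complete(t2, "B data")  # latest request: applied
```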
&lt;h2 id="what-didnt-make-the-cut-yet">What didn’t make the cut (yet)&lt;/h2>
&lt;p>I designed—but did not finish—&lt;strong>per-region plug-in models&lt;/strong> so each grid can use the approach that fits it best. We decided to ship a stable, deployable service first and reserve that flexibility work for the next phase. The design is written down and ready to build.&lt;/p>
&lt;h2 id="links-and-resources">Links and resources&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Project page:&lt;/strong> &lt;a href="project/osre25/ucsc/carboncast/">CarbonCast&lt;/a>&lt;/li>
&lt;li>&lt;strong>Proposal:&lt;/strong> &lt;a href="https://ucsc-ospo.github.io/report/osre25/ucsc/carboncast/20250710-tanushsavadi/" target="_blank" rel="noopener">https://ucsc-ospo.github.io/report/osre25/ucsc/carboncast/20250710-tanushsavadi/&lt;/a>&lt;/li>
&lt;li>&lt;strong>Midterm blog:&lt;/strong> &lt;a href="https://ucsc-ospo.github.io/report/osre25/ucsc/carboncast/20250803-tanushsavadi/" target="_blank" rel="noopener">https://ucsc-ospo.github.io/report/osre25/ucsc/carboncast/20250803-tanushsavadi/&lt;/a>&lt;/li>
&lt;li>&lt;strong>Backend/API (branch):&lt;/strong> &lt;a href="https://github.com/carbonfirst/CarbonCast/tree/django_apis_sqlite" target="_blank" rel="noopener">https://github.com/carbonfirst/CarbonCast/tree/django_apis_sqlite&lt;/a>&lt;/li>
&lt;li>&lt;strong>Frontend/UI:&lt;/strong> &lt;a href="https://github.com/carbonfirst/CarbonCastUI/tree/main" target="_blank" rel="noopener">https://github.com/carbonfirst/CarbonCastUI/tree/main&lt;/a>&lt;/li>
&lt;/ul>
&lt;h2 id="whats-next">What’s next&lt;/h2>
&lt;p>My next steps are clear. I want to finish the &lt;strong>per-region model plug-ins&lt;/strong> so grids can bring their own best forecasting logic. I also plan to carry the &lt;strong>consumption-based&lt;/strong> signal end-to-end, including imports and interconnects surfaced directly in the UI. Finally, I’ll harden the system for production by enabling auth and throttling and by moving to a production-grade database where appropriate.&lt;/p>
&lt;h2 id="thank-you">Thank you&lt;/h2>
&lt;p>Huge thanks to &lt;strong>Professor Abel Souza&lt;/strong> for steady mentorship and to the &lt;strong>OSRE&lt;/strong> community for thoughtful feedback. The most rewarding part of this summer was watching a research idea become something people can &lt;strong>click on—and use&lt;/strong> to make cleaner choices.&lt;/p></description></item><item><title>Midterm blog: CarbonCast Midpoint Update: From Vision to Reality</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250803-tanushsavadi/</link><pubDate>Sun, 03 Aug 2025 00:00:00 +0000</pubDate><guid>https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250803-tanushsavadi/</guid><description>&lt;p>A few months ago, I shared my vision for making carbon intensity forecasts more accessible through the &lt;a href="https://deploy-preview-1007--ucsc-ospo.netlify.app/project/osre25/ucsc/carboncast">CarbonCast project&lt;/a>. My &lt;a href="https://summerofcode.withgoogle.com/programs/2025/projects/7yvAix3k" target="_blank" rel="noopener">proposal&lt;/a> under the mentorship of Professor Abel Souza aims to build an API that makes carbon intensity forecasts more accessible and actionable. I had two main goals: expand CarbonCast to work with more regional electricity grids, and transform it from a research project into something that could actually run and be interacted with in the real world.&lt;/p>
&lt;p>Today, I&amp;rsquo;m excited to share that we&amp;rsquo;ve not only hit those goals – we&amp;rsquo;ve exceeded them in ways I didn&amp;rsquo;t expect.&lt;/p>
&lt;h2 id="what-weve-built-so-far">What We&amp;rsquo;ve Built So Far&lt;/h2>
&lt;p>Remember how I mentioned that CarbonCast needed to support more regional grids? Well, we&amp;rsquo;ve gone big. The system now covers 85+ regions across two continents. We&amp;rsquo;re talking about major US grid operators like ERCOT (Texas), CISO (California), PJM (Mid-Atlantic), MISO (Midwest), and NYISO (New York), plus we&amp;rsquo;ve expanded into European countries like Germany, France, Spain, and the UK.&lt;/p>
&lt;p>But here&amp;rsquo;s the thing – collecting weather data for carbon intensity forecasting isn&amp;rsquo;t as simple as just downloading a few files. Each region needs four different types of weather data: solar radiation (for solar power predictions), wind patterns (for wind power), temperature and humidity (for energy demand), and precipitation (which affects both supply and demand). That means we&amp;rsquo;re managing data collection for over 340 different combinations of regions and weather variables.&lt;/p>
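&lt;p>The arithmetic behind that number is simple: every region crossed with the four weather variables. A quick sketch:&lt;/p>

```python
from itertools import product

# 85+ grids in the pipeline (placeholder names for illustration)
regions = [f"region_{i}" for i in range(85)]
variables = ["solar_radiation", "wind", "temperature_humidity", "precipitation"]

# 85 regions x 4 weather variables = 340 region/variable collection streams
tasks = list(product(regions, variables))
```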
&lt;h2 id="the-automation-challenge">The Automation Challenge&lt;/h2>
&lt;p>When I started this project, I quickly realized that manually managing data collection for this many regions would be impossible. We&amp;rsquo;re talking about thousands of data requests, each taking time to process, with various things that can go wrong along the way.&lt;/p>
&lt;p>So we built something I&amp;rsquo;m really proud of: an intelligent automation system that handles 95% of the work without human intervention. That means 19 out of every 20 data collection tasks happen automatically, even when things go wrong.&lt;/p>
&lt;p>The system is smart about it too. It knows when to speed up data collection, when to slow down to avoid overwhelming the servers, and how to recover when errors happen. We&amp;rsquo;ve achieved 99% data completeness, which means almost every piece of weather data we need actually makes it into our system successfully.&lt;/p>
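&lt;p>The recovery behavior is essentially retry with exponential backoff. A simplified Python sketch of the pattern, not the system’s actual code; the injectable sleep is just to make the sketch testable without waiting:&lt;/p>

```python
import time

def fetch_with_retries(fetch, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Retry a flaky fetch, waiting longer after each failure so we
    never overwhelm a struggling upstream server."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except (TimeoutError, ConnectionError):
            if attempt == max_attempts - 1:
                raise                           # give up after the last attempt
            sleep(base_delay * (2 ** attempt))  # back off: 1s, 2s, 4s, ...

# Simulated flaky endpoint that succeeds on the third call.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise TimeoutError("server busy")
    return "data"

result = fetch_with_retries(flaky, sleep=lambda s: None)  # no real waiting here
```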
&lt;h2 id="making-it-production-ready">Making It Production-Ready&lt;/h2>
&lt;p>The biggest challenge was taking CarbonCast from a research project that worked on my laptop to something that could run reliably for weeks without me babysitting it. This meant building in all the boring but crucial stuff that makes software actually work in the real world.&lt;/p>
&lt;p>We created a comprehensive error handling system that can automatically recover from 95% of the problems it encounters. Network hiccups, server timeouts, data format changes – the system handles these gracefully and keeps running.&lt;/p>
&lt;p>There&amp;rsquo;s also a real-time monitoring dashboard that shows exactly what&amp;rsquo;s happening across all regions. I can see which areas are collecting data successfully, which ones might be having issues, and get alerts if anything needs attention. It&amp;rsquo;s like having a mission control center for carbon data.&lt;/p>
&lt;h2 id="the-dashboard-mission-control-for-carbon-data">The Dashboard: Mission Control for Carbon Data&lt;/h2>
&lt;p>Let me show you what this monitoring system actually looks like. We built a comprehensive web dashboard that gives us real-time visibility into everything that&amp;rsquo;s happening:&lt;/p>
&lt;p>
&lt;figure >
&lt;div class="d-flex justify-content-center">
&lt;div class="w-100" >&lt;img alt="Dashboard Overview" srcset="
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-overview_hucf78c7d7b58d9515d431a2744915c5c5_523170_def2a560c75da61de5422b7a6a6dbc38.webp 400w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-overview_hucf78c7d7b58d9515d431a2744915c5c5_523170_5fcfb689e6283d1720e50da81cfb540f.webp 760w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-overview_hucf78c7d7b58d9515d431a2744915c5c5_523170_1200x1200_fit_q75_h2_lanczos_3.webp 1200w"
src="https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-overview_hucf78c7d7b58d9515d431a2744915c5c5_523170_def2a560c75da61de5422b7a6a6dbc38.webp"
width="760"
height="456"
loading="lazy" data-zoomable />&lt;/div>
&lt;/div>&lt;/figure>
&lt;em>The main dashboard showing real-time system metrics and status across all regions&lt;/em>&lt;/p>
&lt;p>The dashboard shows key metrics at a glance – total requests, completion rates, and active regions. But it goes much deeper than that. You can drill down into individual requests to see their complete lifecycle:&lt;/p>
&lt;p>
&lt;figure >
&lt;div class="d-flex justify-content-center">
&lt;div class="w-100" >&lt;img alt="Request Details" srcset="
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-requests_hu4620d0cbd193aecbbe0c5858e2ba9128_195009_876f419901e0b51127b81f1f37bf33f6.webp 400w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-requests_hu4620d0cbd193aecbbe0c5858e2ba9128_195009_3ae7b2ac3a29478b49913635f43aac19.webp 760w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-requests_hu4620d0cbd193aecbbe0c5858e2ba9128_195009_1200x1200_fit_q75_h2_lanczos_3.webp 1200w"
src="https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-requests_hu4620d0cbd193aecbbe0c5858e2ba9128_195009_876f419901e0b51127b81f1f37bf33f6.webp"
width="760"
height="458"
loading="lazy" data-zoomable />&lt;/div>
&lt;/div>&lt;/figure>
&lt;em>Detailed view of individual data requests showing processing timelines and status&lt;/em>&lt;/p>
&lt;p>Each request card shows everything from the initial request time to when the data becomes available for download. This level of visibility is crucial when you&amp;rsquo;re managing hundreds of data requests across different regions and weather variables.&lt;/p>
&lt;p>The regional analytics view shows how well we&amp;rsquo;re doing across different grid operators:&lt;/p>
&lt;p>
&lt;figure >
&lt;div class="d-flex justify-content-center">
&lt;div class="w-100" >&lt;img alt="Regional Analytics" srcset="
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-regions_hubaa80dcd4d7309dd18fca00b148c0f0f_628115_913b55e9f6633983aaaaf25607ac13bf.webp 400w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-regions_hubaa80dcd4d7309dd18fca00b148c0f0f_628115_950373fdefaf9bd595da010d29c37849.webp 760w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-regions_hubaa80dcd4d7309dd18fca00b148c0f0f_628115_1200x1200_fit_q75_h2_lanczos_3.webp 1200w"
src="https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-regions_hubaa80dcd4d7309dd18fca00b148c0f0f_628115_913b55e9f6633983aaaaf25607ac13bf.webp"
width="760"
height="445"
loading="lazy" data-zoomable />&lt;/div>
&lt;/div>&lt;/figure>
&lt;em>Regional breakdown showing completion status across different electricity grid operators&lt;/em>&lt;/p>
&lt;p>What I&amp;rsquo;m particularly proud of is the error handling dashboard. When things do go wrong (which they inevitably do with any large-scale data system), we can see exactly what happened and how the system recovered:&lt;/p>
&lt;p>
&lt;figure >
&lt;div class="d-flex justify-content-center">
&lt;div class="w-100" >&lt;img alt="Error Management" srcset="
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-errors_hua5a5c30a5cd8b72a26622f5af77b2406_480389_2864a9c4d56dcc6220d2fe406daddc17.webp 400w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-errors_hua5a5c30a5cd8b72a26622f5af77b2406_480389_ca1e5cbfdb24da4f1e6531c7be2eed54.webp 760w,
/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-errors_hua5a5c30a5cd8b72a26622f5af77b2406_480389_1200x1200_fit_q75_h2_lanczos_3.webp 1200w"
src="https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250803-tanushsavadi/dashboard-errors_hua5a5c30a5cd8b72a26622f5af77b2406_480389_2864a9c4d56dcc6220d2fe406daddc17.webp"
width="760"
height="254"
loading="lazy" data-zoomable />&lt;/div>
&lt;/div>&lt;/figure>
&lt;em>Error tracking and resolution system showing 100% success rate in region mapping&lt;/em>&lt;/p>
&lt;p>The fact that we&amp;rsquo;re showing &amp;ldquo;No unknown regions found&amp;rdquo; means our coordinate-based region detection system is working perfectly – every weather data request gets properly mapped to the right electricity grid.&lt;/p>
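&lt;p>Coordinate-based region detection boils down to mapping a weather grid point’s latitude and longitude to an electricity grid. A toy Python sketch; the bounding boxes below are rough illustrations, not the real balancing-authority shapes the system uses:&lt;/p>

```python
# Toy lookup from (lat, lon) to a grid operator. Real region shapes are
# polygons, not rectangles; these boxes are for illustration only.

REGION_BOXES = {
    # region: (lat_min, lat_max, lon_min, lon_max)
    "CISO":  (32.5, 42.0, -124.5, -114.0),   # roughly California
    "ERCOT": (25.8, 36.5, -106.7, -93.5),    # roughly Texas
}

def region_for(lat: float, lon: float):
    for region, (la0, la1, lo0, lo1) in REGION_BOXES.items():
        if la0 <= lat <= la1 and lo0 <= lon <= lo1:
            return region
    return None   # an "unknown region" would be flagged on the error dashboard
```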
&lt;h2 id="the-technical-foundation">The Technical Foundation&lt;/h2>
&lt;p>Under the hood, we&amp;rsquo;ve built what I&amp;rsquo;d call enterprise-grade infrastructure. The system can run autonomously for weeks, automatically organizing data by region and weather type, managing storage efficiently, and even optimizing its own performance based on what it learns.&lt;/p>
&lt;p>We&amp;rsquo;ve also created comprehensive testing systems to make sure everything works reliably. When you&amp;rsquo;re dealing with data that people might use to make real decisions about when to charge their electric vehicles or run their data centers, reliability isn&amp;rsquo;t optional.&lt;/p>
&lt;p>The architecture follows a modular, service-oriented design with clear separation between data collection, processing, monitoring, and user interfaces. This makes it much easier to maintain and extend as we add new features.&lt;/p>
&lt;h2 id="why-this-matters">Why This Matters&lt;/h2>
&lt;p>All of this infrastructure work might sound technical, but it&amp;rsquo;s directly connected to the original vision: making carbon intensity forecasts accessible to everyone.&lt;/p>
&lt;p>With this foundation in place, we can now provide reliable, up-to-date weather data for carbon intensity forecasting across major electricity grids in North America and Europe. That means developers building carbon-aware applications, companies trying to reduce their emissions, and individuals wanting to time their energy use for lower environmental impact all have access to the data they need.&lt;/p>
&lt;h2 id="whats-next-breaking-down-carboncast">What&amp;rsquo;s Next: Breaking Down CarbonCast&lt;/h2>
&lt;p>The next phase is where things get really exciting. Now that we have this solid data collection foundation, we&amp;rsquo;re going to break down CarbonCast itself into modular components. This will make it easier for developers to integrate carbon intensity forecasting into their own applications, whether that&amp;rsquo;s a smart home system, a cloud computing platform, or a mobile app that helps people make greener energy choices.&lt;/p>
&lt;h2 id="looking-back">Looking Back&lt;/h2>
&lt;p>When I started this project, I knew we needed better infrastructure for carbon data. What I didn&amp;rsquo;t expect was how much we&amp;rsquo;d end up building – or how well it would work. We&amp;rsquo;ve created something that can reliably collect and organize weather data across two continents, handle errors gracefully, and run without constant supervision.&lt;/p>
&lt;p>More importantly, we&amp;rsquo;ve built the foundation that will make it possible for anyone to access accurate carbon intensity forecasts. Whether you&amp;rsquo;re a developer building the next generation of carbon-aware applications or someone who just wants to know the best time to do laundry to minimize your environmental impact, the infrastructure is now there to support those decisions.&lt;/p>
&lt;p>The vision of making carbon data accessible and actionable is becoming reality, one automated data collection at a time.&lt;/p>
&lt;h2 id="impact-beyond-research">Impact Beyond Research&lt;/h2>
&lt;p>This work builds directly on the foundation of &lt;em>Multi-day Forecasting of Electric Grid Carbon Intensity using Machine Learning&lt;/em>, transforming research into practical, real-world infrastructure. We&amp;rsquo;re not just making carbon intensity forecasts more accurate – we&amp;rsquo;re making them accessible to everyone who wants to reduce their environmental impact.&lt;/p>
&lt;p>The open-source nature of CarbonCast means that anyone can run, contribute to, and benefit from this work. Whether you&amp;rsquo;re a developer building carbon-aware applications, a policymaker working on grid decarbonization strategies, or a sustainability-conscious individual looking to reduce your carbon footprint, the tools are now there to make informed, impactful choices.&lt;/p>
&lt;p>Looking ahead, I&amp;rsquo;m excited to see how this infrastructure will enable the next generation of carbon-aware computing and smart energy decisions.&lt;/p></description></item><item><title>CarbonCast</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250710-tanushsavadi/</link><pubDate>Thu, 10 Jul 2025 00:00:00 +0000</pubDate><guid>https://deploy-preview-1007--ucsc-ospo.netlify.app/report/osre25/ucsc/carboncast/20250710-tanushsavadi/</guid><description>&lt;p>As part of the &lt;a href="https://deploy-preview-1007--ucsc-ospo.netlify.app/project/osre25/ucsc/carboncast">CarbonCast project&lt;/a>, my &lt;a href="https://summerofcode.withgoogle.com/programs/2025/projects/7yvAix3k" target="_blank" rel="noopener">proposal&lt;/a> under the mentorship of Professor Abel Souza aims to build an API that makes carbon intensity forecasts more accessible and actionable.&lt;/p>
&lt;p>Building on CarbonCast, the project creates an API that lets users access energy data and apply it to optimize their electricity consumption. Before diving into the details of the project, I’d like to share a bit about my background.&lt;/p>
&lt;h2 id="about-me">About Me&lt;/h2>
&lt;p>Hi, I’m Tanush—a rising senior at the University of Massachusetts Amherst, majoring in Computer Science and Mathematics and graduating in Spring 2026. Currently, I’m an AI Intern for the Commonwealth of Massachusetts Department of Unemployment Assistance, where I’m developing an end-to-end retrieval-augmented generation (RAG) chatbot on AWS.&lt;/p>
&lt;p>In the past, I’ve contributed to CarbonCast in a different capacity, designing a user interface to help visualize carbon intensity forecasts. I also worked at MathWorks as a Machine Learning Intern, where I collaborated in an Agile environment to design and deploy predictive models that improved precision torque control and dynamic responsiveness in motor-driven robotic and industrial systems.&lt;/p>
&lt;p>I’m excited to bring these experiences to this year’s GSoC project, where I’ll be building tools to make carbon data more accessible and actionable for everyone.&lt;/p>
&lt;h2 id="what-is-carboncast">What is CarbonCast?&lt;/h2>
&lt;p>CarbonCast is a Python-based machine-learning library designed to forecast the carbon intensity of electrical grids. Carbon intensity refers to the amount of carbon emitted per kilowatt-hour (kWh) of electricity consumed. The current version delivers accurate forecasts in numerous regions by using a region’s historical energy production data, the time of day and year, and weather forecasts as features.&lt;/p>
&lt;p>However, there is no easy way to access, visualize, and use this data through a standard interface. Important information is also missing. For instance, electricity grids often import electricity from neighboring regions, so a region’s consumption depends on both local generation and imports. Moreover, each energy source benefits from a tailored predictive model. Consequently, any solution that tries to reduce the carbon emissions of its electricity consumption will benefit more from following a consumption-based carbon intensity signal.&lt;/p>
&lt;p>Unlike other third-party carbon services, CarbonCast’s model is open-sourced, allowing users to study, understand, and improve its behavior. This transparency invites public collaboration and innovation. It also contrasts sharply with proprietary services that often withhold both the logic behind their models and the data they are trained on.&lt;/p>
&lt;h2 id="why-this-matters">Why This Matters&lt;/h2>
&lt;p>Electricity usage is one of the largest contributors to carbon emissions globally. Carbon intensity—the amount of carbon emitted per kilowatt-hour of electricity consumed—varies with how electricity is generated (for example, coal versus solar) and with demand. With better visibility into when the grid is cleaner, individuals and organizations can shift their energy consumption to lower-carbon periods, which often also have lower prices. This enables everyday energy optimizations without compromising comfort or productivity.&lt;/p>
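&lt;p>Concretely, a grid’s average carbon intensity is a generation-weighted average over its energy mix. A small sketch; the emission factors are rough lifecycle values for illustration, not CarbonCast’s model:&lt;/p>

```python
# Generation-weighted carbon intensity from an energy mix.
# Emission factors are rough lifecycle estimates (gCO2e per kWh generated).

EMISSION_FACTORS = {
    "coal": 820, "gas": 490, "solar": 41, "wind": 11, "nuclear": 12, "hydro": 24,
}

def carbon_intensity(mix_kwh: dict) -> float:
    """Weighted average gCO2e/kWh across the sources in the mix."""
    total_kwh = sum(mix_kwh.values())
    total_g = sum(kwh * EMISSION_FACTORS[src] for src, kwh in mix_kwh.items())
    return total_g / total_kwh

sunny_noon = {"solar": 60, "gas": 30, "wind": 10}   # kWh shares at midday
evening    = {"gas": 70, "coal": 20, "wind": 10}    # kWh shares after sunset
```

&lt;p>The same load is several times cleaner at midday than in the evening here, which is exactly the gap that shifting consumption exploits.&lt;/p>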
&lt;p>By improving CarbonCast’s accessibility and functionality, we are helping people and institutions answer questions like:&lt;/p>
&lt;ul>
&lt;li>When is the best time to charge my EV to reduce environmental impact?&lt;/li>
&lt;li>Can I run my energy-hungry server jobs when the electricity is cheaper?&lt;/li>
&lt;li>How do I actually reduce my emissions without guessing?&lt;/li>
&lt;/ul>
&lt;p>By providing clear, accurate forecasts of carbon intensity, CarbonCast can help users make informed decisions to optimize their energy footprint and reduce emissions without sacrificing convenience or productivity.&lt;/p>
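&lt;p>Questions like these reduce to finding the lowest-carbon window in a forecast. A minimal sketch: given hourly forecasts in gCO₂/kWh, pick the contiguous window with the lowest average (the forecast values below are hypothetical):&lt;/p>

```python
def best_window(forecast, hours_needed: int) -> int:
    """Return the start index of the contiguous window with the lowest
    average forecast carbon intensity. `forecast` is gCO2/kWh per hour."""
    best_start = 0
    best_total = sum(forecast[:hours_needed])
    total = best_total
    for start in range(1, len(forecast) - hours_needed + 1):
        # Slide the window: drop the hour leaving, add the hour entering.
        total += forecast[start + hours_needed - 1] - forecast[start - 1]
        if total < best_total:
            best_start, best_total = start, total
    return best_start

# Hypothetical 12-hour forecast: cleaner midday (solar), dirtier evening.
forecast = [420, 410, 380, 300, 220, 180, 170, 190, 260, 380, 450, 470]
start = best_window(forecast, hours_needed=3)   # e.g. a 3-hour EV charge
```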
&lt;h2 id="what-im-building">What I’m Building&lt;/h2>
&lt;p>The plan for this summer is to develop the backend API services for CarbonCast, focused on two major goals:&lt;/p>
&lt;h3 id="geographical-expansion">Geographical Expansion&lt;/h3>
&lt;p>I am extending CarbonCast’s compatibility to support more regional electricity grids. Each model will be customized for local grid behavior and renewable energy characteristics. This involves tuning the model pipeline to adapt to each region’s energy mix, weather patterns, and reporting granularity.&lt;/p>
&lt;h3 id="system-refactoring-and-modularity">System Refactoring and Modularity&lt;/h3>
&lt;p>The original CarbonCast system was built as a research artifact. To refine it into production-grade infrastructure, I am refactoring the codebase to improve modularity. This makes it easier to plug in new regions, update forecasting algorithms, and integrate new data sources.&lt;/p>
&lt;h2 id="impact-beyond-research">Impact Beyond Research&lt;/h2>
&lt;p>The paper that inspired this project, &lt;em>Multi-day Forecasting of Electric Grid Carbon Intensity using Machine Learning&lt;/em>, pioneered the idea of forecasting carbon intensity over multiple days using a hierarchical machine learning model. This goes beyond the typical 24-hour day-ahead models that are common in the industry and allows for better planning and longer-term decision-making.&lt;/p>
&lt;p>CarbonCast builds directly on that foundation by transforming research into practical, real-world infrastructure. It is an open-source library that anyone can run, contribute to, and benefit from. Whether you&amp;rsquo;re a developer building carbon-aware applications, a policymaker working on grid decarbonization strategies, or a sustainability-conscious individual looking to reduce your carbon footprint, CarbonCast provides the tools to make informed, impactful choices.&lt;/p>
&lt;h2 id="looking-ahead">Looking Ahead&lt;/h2>
&lt;p>I am excited to contribute to a project that blends machine learning, systems engineering, sustainability, and public impact. My goal is to help make it easier for everyone to see, understand, and act on their carbon footprint while also providing the &amp;ldquo;visibility&amp;rdquo; people need to take meaningful, informed actions.&lt;/p></description></item></channel></rss>