Mark Jaggassar was SVP of Products & Analytics at Rhythmos before the company joined Electron in October 2025. He continues in that role at Electron, bringing expertise in deploying DER management and grid analytics at scale for US utilities.
Tell us a bit about your background.
My background spans both software development and electrical power systems. About 15 years ago, I looked at the forces reshaping the electricity sector – digitization, decarbonization, and decentralization – and recognized that the industry would need people who could bridge the worlds of Ohm’s law and Moore’s law.
Since then, I’ve had the opportunity to deploy smart grid initiatives at over 30 utilities across the US, Canada, and the UK. My focus is on how the industry can intelligently manage distributed energy resources at scale.
I’ve designed and deployed control strategies for both front-of-meter and behind-the-meter DER at kilowatt and megawatt scale, spanning traditional communication protocols and modern IoT-based aggregation platforms.
Most recently, I built the Rhythmos grid analytics platform from the ground up.
Rhythmos, now ElectronCompass, built momentum by turning existing utility data into insights. How does that philosophy expand now that it sits inside Electron?
At Rhythmos, our focus was meeting utilities where they are in their data journey. We believed that regardless of starting point, there is always meaningful value to be unlocked from readily available, underutilized utility data – value that helps utilities proactively manage their grid-edge assets in a way that works for everyone.
Joining Electron has simply expanded that remit. In the context of electrification, our data-driven insights help utilities answer a core question: “when and where do I need to pay attention?”
With Electron, we can now address both the forward-looking and backward-looking sides of that challenge. The key leading question is “how do I make cost-prudent decisions on DERs and non-wires alternatives?” and the key lagging question is “what do I do about it?”
Our grid analytics sit naturally in the middle, bridging those two perspectives to make them stronger together.
In your view, what’s the biggest gap between how people talk about distribution grids and how utilities actually operate them?
It’s worth acknowledging upfront that the US utility landscape is too diverse for any single answer to fit all cases.
Comparing a large, vertically integrated utility with millions of meters and best-in-class operations and planning against a small public utility focused on least-cost last-mile delivery is an apples-to-oranges exercise.
Their operational, digital, and analytical starting points will differ enormously – and that shapes both their grid modernization strategies and what is realistically achievable.
That said, there is a significant gap between what advanced software promises to deliver and what is actually possible today or in the short term.
We can unpack that disconnect by asking some relatively simple questions: does it work in practice, will it scale, what is the realistic time to value, and will it stand up to regulatory scrutiny? Through that lens, we can start disentangling the hype from reality.
How do you balance innovation with credibility, especially in a sector where regulators and reliability come first?
Innovation in the energy sector needs to treat regulation and reliability as foundational priorities. Unlike in typical tech culture, a “move fast and break things” mindset is incompatible with critical infrastructure that operates in real time.
Progress is absolutely achievable, but any advanced solution deployed must enhance system reliability and hold up to regulatory scrutiny.
When it comes to AI, meaningful discussion requires distinguishing between planning and operational applications. On the planning side, statistical foundations built long before machine learning existed – including error quantification and confidence bounds – must not be abandoned in pursuit of more sophisticated models.
Improved prediction accuracy alone is not enough; understanding the range of potential outcomes remains essential in an industry where reliability is paramount. This isn’t a sign of model weakness but of honesty that preserves well-proven engineering principles.
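To make that idea concrete, here is a minimal sketch of the kind of error quantification described above – a hypothetical load-forecasting example where, instead of reporting only a point forecast, the model also reports an empirical prediction interval built from held-out residuals. The synthetic data, the naive seasonal model, and the interval method are all illustrative assumptions, not a description of the Rhythmos or Electron platforms.

```python
import numpy as np

# Hypothetical example: report a prediction interval, not just a point forecast.
rng = np.random.default_rng(42)

# Synthetic hourly load (kW) with a daily cycle plus noise -- stand-in data.
hours = np.arange(24 * 60)  # 60 days of hourly readings
load = 500 + 150 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 30, hours.size)

# Naive seasonal model: predict each hour as the mean load at that hour of day.
train, holdout = load[: 24 * 45], load[24 * 45 :]
hourly_mean = train.reshape(-1, 24).mean(axis=0)

# Residuals on held-out days give an empirical error distribution.
residuals = holdout - np.tile(hourly_mean, holdout.size // 24)

# 90% prediction interval from residual quantiles -- no normality assumption.
lo, hi = np.quantile(residuals, [0.05, 0.95])
forecast = hourly_mean[18]  # point forecast for 6 pm
print(f"6 pm forecast: {forecast:.0f} kW, 90% interval: "
      f"[{forecast + lo:.0f}, {forecast + hi:.0f}] kW")
```

The point is the reporting convention rather than the model: a planner sees not just an expected value but a defensible range of outcomes, which is exactly the kind of statistical discipline that predates machine learning.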
On the operational side, even highly accurate AI-driven tools can introduce risk if they come with high prediction variance, potentially causing downstream impacts that raise serious questions of accountability.
Rather than chasing cutting-edge solutions for their own sake, the industry is better served by honestly embracing both the benefits and the limitations of these technologies.
That balanced approach tends to surface credible, high-value use cases far more quickly.
