
Explore how Madington engineered Maike, a multi-agent AI system that transforms unstructured brand data into performant, production-ready digital ads.
Behind the Code at Madington: Building Maike
In this behind-the-scenes post, Linus Forsell (Business Partner & Web Developer at Madington) shares insights into the engineering decisions, agent orchestration, and complex UX challenges involved in building Maike—our automated creative and campaign-builder platform.
The digital advertising industry changes fast. For over a decade, we at Madington have focused on crafting digital ad experiences. We learned early on that relying on manual production for every banner and rich media execution was a bottleneck—not just for us, but for media owners and advertisers needing to stay relevant. This realization led us to build our own Creative Management Platform, which we call Station.
Building Station allowed us to simplify the creative process by introducing Dynamic Creative Optimization (DCO). For us, DCO isn't just about rendering dynamic text; it's about reusing code efficiently, enabling quick market alignment as narratives change, and establishing long-term cost effectiveness. By reducing supply chain complications and minimizing production churn, Station gave our clients a clearer path to market.
It gave us a solid structural foundation. But as campaign demands grew and the need for relevance increased, we needed a faster, smarter way to populate those structures with creative assets.
Naturally, we looked toward the industry's current obsession: Artificial Intelligence. AI is frequently portrayed as a silver bullet that will solve workflow inefficiency, generate assets on demand, and somehow reduce our carbon footprint in the process.
But as developers, designers, and problem solvers, we know technology rarely works like magic. It requires careful engineering, thoughtful architecture, and respect for the user experience.
That’s why we built Maike.
The original idea for Maike was much smaller in scope. We simply wanted an internal mockup generator. We often struggled to make time to quickly generate high-fidelity mockups for clients' sales pitches or other immediate needs, and we thought an automated tool could alleviate that pressure.
But as we began building, the project evolved. The result is a full-fledged production system integrated directly into our ecosystem. Rather than just spitting out static images, Maike is designed to serve as a self-serve platform that digests brand data, analyzes web pages, and repurposes that content into actual, production-ready display ads. Behind the scenes, Maike is not a single prompt pulling strings. It is an orchestrated system of specialized components working together within a practical engineering framework.
In this article, I want to take you behind the code to explore how we engineered Maike—from our architectural decisions and agent orchestration to how we build fluid UX and handle complex state management.
Workflows and Agent Orchestration
To understand how Maike operates, we first need to define "workflows" and "agents." In our platform, you can think of an agent as a specialized digital worker—essentially an isolated inference instance equipped with specific tools, context, and a narrow set of visual constraints. Rather than relying on a single AI model to handle everything—which frequently leads to hallucinated data, broken layouts, and bloated code—we process user intents through a custom agent orchestration layer.
Our architecture relies on horizontal, specialized agents. When a user requests a new campaign based on a specific URL, the orchestration layer delegates tasks. We have a "Creative Director" agent that makes decisions on layout hierarchies. We have a "Layout Context" agent that decides whether the incoming data fits a hero-branding layout or a multi-item carousel. We also have agents focused on prompt structure and copy refinement. By routing tasks to specialized agents, we maintain quality control and predictable outputs.
A "workflow" is simply the sequence of these agents collaborating. One agent finishes its data extraction, hands its findings off to the next, and the process continues until a production-ready creative is authored.
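The hand-off pattern above can be sketched in a few lines. This is a deliberately minimal illustration, not Maike's actual orchestration layer: the `Context` shape, the agent names, and the layout rule are all hypothetical.

```typescript
// Minimal sketch of a workflow: each agent is an async step that receives the
// accumulated context and hands an enriched copy to the next agent.
// Agent names and the Context shape are illustrative, not Maike's real API.

type Context = Record<string, unknown>;
type Agent = { name: string; run: (ctx: Context) => Promise<Context> };

// A workflow is just an ordered hand-off between specialized agents.
async function runWorkflow(agents: Agent[], initial: Context): Promise<Context> {
  let ctx = initial;
  for (const agent of agents) {
    ctx = await agent.run(ctx); // each agent sees the previous agent's findings
  }
  return ctx;
}

// Example: two toy agents standing in for extraction and layout selection.
const extractor: Agent = {
  name: "extractor",
  run: async (ctx) => ({ ...ctx, products: ["shoe-a", "shoe-b"] }),
};
const layoutContext: Agent = {
  name: "layout-context",
  run: async (ctx) => ({
    ...ctx,
    layout: (ctx.products as string[]).length > 1 ? "carousel" : "hero",
  }),
};

runWorkflow([extractor, layoutContext], { url: "https://example.com" }).then(
  (result) => console.log(result.layout), // "carousel" for the two-item example
);
```

The value of the pattern is that each agent's contract is narrow: it only has to enrich the context it receives, which keeps outputs predictable and failures isolated.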
Amplifying a Weak Signal
A digital ad is only as effective as the data behind it. One of the main engineering challenges we faced while building Maike was ensuring the intelligence gathering process—how we understand a brand and its intent—was accurate and nuanced.
Getting good brand data is difficult. When you point Maike at a client's website, it is essentially walking into an unstructured room full of noise. Websites contain navigation menus, footer links, privacy policies, and pop-ups.
Our extraction workflows are engineered to filter out this noise and find what is relevant. As our agents analyze a web page, they look for the primary focus. Is this a single-product spotlight? Is it a category page listing different shoes? Is it a brand-awareness piece focusing on company ethos?
Understanding the user's intent is important. Our systems comb through the markup—evaluating DOM structures, schema data, and visual hierarchies—to gather specific product or service information, pulling out imagery, pricing details, and key selling propositions.
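One concrete source of that structured signal is schema.org data embedded in the page. As a simplified sketch (the regex-based scan and the `ProductSignal` shape are illustrative, not our production extractor), pulling product facts out of JSON-LD blocks might look like this:

```typescript
// Sketch: extracting product signals from schema.org JSON-LD blocks in a page.
// Simplified for illustration; a real extractor would use a proper HTML parser
// and also evaluate DOM structure and visual hierarchy.

interface ProductSignal {
  name?: string;
  price?: string;
  image?: string;
}

function extractJsonLdProducts(html: string): ProductSignal[] {
  const products: ProductSignal[] = [];
  const scripts =
    html.match(/<script[^>]*application\/ld\+json[^>]*>([\s\S]*?)<\/script>/g) ?? [];
  for (const tag of scripts) {
    const body = tag.replace(/^<script[^>]*>/, "").replace(/<\/script>$/, "");
    try {
      const data = JSON.parse(body);
      const nodes = Array.isArray(data) ? data : [data];
      for (const node of nodes) {
        if (node["@type"] === "Product") {
          products.push({ name: node.name, price: node.offers?.price, image: node.image });
        }
      }
    } catch {
      // Malformed JSON-LD is part of the "noise" — skip it.
    }
  }
  return products;
}

const html = `<script type="application/ld+json">
{"@type":"Product","name":"Trail Shoe","offers":{"price":"99.00"}}
</script>`;
console.log(extractJsonLdProducts(html)[0].name); // "Trail Shoe"
```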
But grabbing data isn’t always a perfect process. Sometimes, the "signal" is weak. A brand might have a compelling narrative but lack high-resolution product photography, or their provided assets might be awkwardly cropped or heavily compressed. If the gathering process yields incomplete assets, we cannot just force them into a layout.
When Maike encounters a weak signal, it shifts from analytical extraction into generative synthesis. Our agents evaluate the provided assets and determine if they need upscaling to meet modern display standards. If a product photo lacks context, Maike can composite it, placing a standalone item into a brand-aligned lifestyle environment. In cases where visual assets are entirely missing but the narrative is strong, the system can generate new, photorealistic imagery to support the core message.
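That escalation ladder — use as-is, upscale, composite, generate — can be sketched as a simple decision function. The thresholds, field names, and strategy labels below are hypothetical; they only illustrate the shape of the decision, not Maike's internal heuristics.

```typescript
// Sketch of the weak-signal escalation described above: depending on how much
// usable imagery survives extraction, pick a repair strategy.
// Thresholds and strategy names are illustrative, not Maike's internals.

interface AssetSignal {
  hasImages: boolean;
  minWidthPx: number;        // smallest usable image dimension found
  hasProductCutout: boolean; // product isolated, but lacking context/background
}

type Strategy = "use-as-is" | "upscale" | "composite" | "generate";

function chooseStrategy(signal: AssetSignal, targetWidthPx = 1200): Strategy {
  if (!signal.hasImages) return "generate";        // no visuals: synthesize imagery
  if (signal.hasProductCutout) return "composite"; // place item in a lifestyle scene
  if (signal.minWidthPx < targetWidthPx) return "upscale"; // below display specs
  return "use-as-is";
}

console.log(chooseStrategy({ hasImages: false, minWidthPx: 0, hasProductCutout: false }));
// "generate"
```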
By bridging the gap between available reality and generated assets, we ensure a weak signal doesn't result in a messy creative.
But this generation is tightly constrained. Maike must understand the visual rhythm and brand voice. Our agents scan for design tokens—primary and secondary color palettes, typographic hierarchies, button styling, and spacing constraints. The system also learns how the brand speaks: are they playful and colloquial, or strictly professional?
By reconstructing the visual rhythm and semantic tone of a brand's website, Maike ensures the creatives it generates feel authentic. The automated banner doesn't look like a generic template populated with scraped data; it feels like an extension of the brand's own carefully crafted design system.
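To make the token-scanning idea concrete, here is one narrow slice of it: inferring a candidate palette by frequency-ranking the colors found in a page's CSS. This is a toy sketch of the ranking idea only; real token extraction would also weigh typography, spacing, and component styling.

```typescript
// Sketch: inferring a brand palette by tallying hex colors in a page's CSS.
// Illustrative only — shows the frequency-ranking idea, not a full token scanner.

function inferPalette(css: string, topN = 2): string[] {
  const counts = new Map<string, number>();
  for (const match of css.matchAll(/#[0-9a-fA-F]{6}\b/g)) {
    const color = match[0].toLowerCase();
    counts.set(color, (counts.get(color) ?? 0) + 1);
  }
  // The most frequent colors are treated as primary, then secondary.
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, topN)
    .map(([color]) => color);
}

const css = ".btn{background:#FF6600}.hero{color:#ff6600}.footer{background:#222222}";
console.log(inferPalette(css)); // ["#ff6600", "#222222"]
```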
Authoring Performant Creatives
Connecting intelligence gathering to the final output is where developer craftsmanship matters most. At Madington, we don’t just care about making things look good; we care about how they perform.
Once our workflows have gathered the brand insights, extracted the visual rhythm, and refined the copy, the final step is authoring code that renders highly performant ads. This is a crucial distinction. We are not generating static image mockups; Maike outputs production-ready, dynamic structural code.
The creatives must load fast, and they cannot waste data. As we've learned with our proprietary streaming technology, data waste directly correlates with CO2 emissions and poor user experiences. Publishers and ad-serving specifications impose strict limits on how much data ads can load.
When Maike authors a creative, it applies rigorous engineering standards to the output. The code is optimized to ensure assets are lightweight, layouts shift dynamically without causing reflow bottlenecks in the browser, and animations execute smoothly without taxing the device's CPU.
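A simple example of such a standard is a weight budget gate applied before a creative ships. The 150 kB cap below is in the spirit of common IAB-style initial-load guidelines, but the exact budget depends on format and publisher; the `Asset` shape and polite-load flag are illustrative.

```typescript
// Sketch: a post-authoring budget gate. Before a creative ships, sum its asset
// weights against an initial-load cap. The cap and Asset shape are illustrative.

interface Asset { url: string; bytes: number; deferred: boolean }

function withinInitialLoadBudget(assets: Asset[], capBytes = 150 * 1024): boolean {
  // Deferred (polite-load) assets don't count toward the initial load.
  const initial = assets
    .filter((a) => !a.deferred)
    .reduce((sum, a) => sum + a.bytes, 0);
  return initial <= capBytes;
}

const creative: Asset[] = [
  { url: "hero.webp", bytes: 90_000, deferred: false },
  { url: "logo.svg", bytes: 8_000, deferred: false },
  { url: "video.mp4", bytes: 900_000, deferred: true }, // polite-loaded later
];
console.log(withinInitialLoadBudget(creative)); // true — 98 kB initial load
```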
This ties directly into our approach to sustainability. It wouldn’t make sense to develop a tool aimed at generating assets if the resulting creatives were bloated, resource-intensive, and detrimental to publisher websites. By ensuring the code Maike authors is clean and hyper-optimized, we maintain a balance between automation and environmental responsibility. We are reducing production time, allowing clients to recycle assets, and ensuring the final data delivery stays lean on the end user's network.
Fluid UX and Complex State Handling
Building a multi-agent orchestration engine is one thing; making it feel intuitive and responsive to the end user is another challenge.
When a user initiates a workflow in Maike, they aren’t waiting hours for an email with a zip file. They experience the creative process happening in real time, via a conversational canvas interface.
Bringing this to life required serious effort in handling complex UI states. An AI generation workflow might take several minutes and stream thousands of varied data points back to the user's browser. This data stream includes conversational text, structural hints for layouts, extracted imagery, and hidden tool executions.
To maintain a fluid user experience, our frontend architecture employs resilient stream watchdogs and a robust heartbeat system. When an AI data stream is active, the browser and our backend maintain a constant pulse. If a user's connection drops for a fraction of a second, or if a data stream stalls mid-generation, our watchdogs detect the anomaly. Instead of splashing an error screen or losing the user's place, the system drops the dead connection and automatically reconnects in the background, resuming the generation where it left off.
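The watchdog pattern reduces to a small core: every event or heartbeat resets a stall timer, and if the timer fires, the dead connection is dropped and the stream is resumed from the last seen event id. The `connect()` callback and event shape below are hypothetical stand-ins for a real transport such as SSE or WebSockets.

```typescript
// Sketch of a stream watchdog: each chunk or heartbeat re-arms a stall timer;
// if the timer fires, drop the stalled connection and resume from the last
// event id. The connect() callback and StreamEvent shape are hypothetical.

type StreamEvent = { id: number; payload: string };

function watchStream(
  connect: (fromId: number, onEvent: (e: StreamEvent) => void) => () => void,
  onEvent: (e: StreamEvent) => void,
  stallMs = 5000,
) {
  let lastId = 0;
  let disconnect: () => void = () => {};
  let timer: ReturnType<typeof setTimeout> | undefined;

  const armWatchdog = () => {
    if (timer) clearTimeout(timer);
    timer = setTimeout(resume, stallMs); // no pulse within stallMs => reconnect
  };

  const resume = () => {
    disconnect();                        // drop the stalled connection silently
    disconnect = connect(lastId, handle); // resume where generation left off
    armWatchdog();
  };

  const handle = (e: StreamEvent) => {
    lastId = e.id; // remember progress so a reconnect resumes, not restarts
    armWatchdog();
    onEvent(e);
  };

  resume(); // the initial connection uses the same path as a reconnect
  return () => { if (timer) clearTimeout(timer); disconnect(); };
}
```

Because reconnection reuses the normal connect path with `lastId` as the cursor, the recovery logic stays invisible to the rest of the UI.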
Furthermore, because these data streams are dense, we built strict UI deduplication systems. As multidimensional payloads arrive, our state handlers merge data gracefully to prevent layout jitter or duplicated interface elements.
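The deduplication idea boils down to merging by a stable key: a re-sent or updated payload replaces its earlier version instead of appending a duplicate element. The `Payload` shape here is illustrative, not our actual state model.

```typescript
// Sketch of keyed deduplication: stream payloads carry a stable key, and the
// state handler merges by key so an updated payload replaces its earlier
// version instead of appending a duplicate UI element. Payload is illustrative.

interface Payload { key: string; kind: string; data: unknown }

function mergePayload(state: Payload[], incoming: Payload): Payload[] {
  const i = state.findIndex((p) => p.key === incoming.key);
  if (i === -1) return [...state, incoming]; // first time: append
  const next = state.slice();
  next[i] = { ...state[i], ...incoming };    // seen before: update in place
  return next;
}

let ui: Payload[] = [];
ui = mergePayload(ui, { key: "hero-img", kind: "image", data: "v1" });
ui = mergePayload(ui, { key: "hero-img", kind: "image", data: "v2" }); // no duplicate
console.log(ui.length, ui[0].data); // 1 "v2"
```

Merging in place also avoids layout jitter: the element keeps its position in the list while its contents update.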
But perhaps the most distinct UX decision we made was how we visualize the "thinking" process. Instead of showing the user a terminal-style text dump of what the AI is analyzing, we transform these raw data streams into visual "Workflow Progress."
As our specialized agents execute their tools—fetching a site, retrieving brand palettes, stripping background imagery—our state managers catch these hidden operations and render them as graphical progress bars and status indicators. The user gets a clear visualization of the behind-the-scenes actions. They are kept in the loop, understanding what steps are being taken to author their creative, which builds trust in the automated process.
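Conceptually, this is a mapping from hidden tool-execution events to user-facing progress steps. The tool names and labels below are made up for illustration; the point is that unlabeled internal operations simply stay hidden.

```typescript
// Sketch: mapping hidden tool executions onto user-facing progress steps.
// Tool names and labels are illustrative, not Maike's real tool registry.

const TOOL_LABELS: Record<string, string> = {
  fetch_site: "Fetching your site…",
  extract_palette: "Retrieving brand palette…",
  remove_background: "Stripping background imagery…",
};

interface ToolEvent { tool: string; status: "running" | "done" }
interface ProgressStep { label: string; done: boolean }

function toProgress(events: ToolEvent[]): ProgressStep[] {
  return events
    .filter((e) => e.tool in TOOL_LABELS) // tools without a friendly label stay hidden
    .map((e) => ({ label: TOOL_LABELS[e.tool], done: e.status === "done" }));
}

console.log(
  toProgress([
    { tool: "fetch_site", status: "done" },
    { tool: "extract_palette", status: "running" },
  ]),
);
// [ { label: "Fetching your site…", done: true },
//   { label: "Retrieving brand palette…", done: false } ]
```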
Conclusion
As an industry, we are stepping into a complex era. The intersection of generative automation, performance optimization, and sustainability is where the most meaningful innovations will happen.
Building Maike wasn't about bolting an AI chatbot onto a dashboard. It was about taking years of experience in creative production, streaming technologies, and web performance, and applying it to a new paradigm.
By utilizing specialized agent orchestration, focusing on noise-free data extraction, and upholding strictly optimized coding standards for the creatives we generate, we've created a system that scales effectively.
Most importantly, by wrapping this complex state handling in a fluid, resilient, and transparent user experience, we ensure that automation serves the user, rather than the other way around.
At Madington, we’re continuously looking for smarter ways to evolve our platform while carefully considering its impact on emissions and sustainability. It’s all about finding a balance between automation and responsibility. With Maike, we've taken a real step toward that ideal, proving that when handled with care, well-written code can still feel like magic.
Interested in what Maike can do for you?
Whether you are looking to generate rapid, high-fidelity mockups for sales pitches, or you want to introduce entirely new, automated production capabilities for your in-house team or media agency, we'd love to show you how Maike works.
Get in touch with us at Madington to learn more.
