

When one person can do the work of an entire company, the power to build and the responsibility to build well become the same thing.
There’s never been a better—or more dangerous—time to create. With AI-enhanced tools like Base44, Figma, Lovable, and Replit, one person can now design, code, and deploy in a week what once required a full team, a project manager, and a quarter’s worth of sprints. The barriers to creation have collapsed.
That’s exhilarating… and destabilizing.
When teams were larger, ethical checkpoints were built into the process. Someone always asked the hard questions: Is this fair? Is this safe? Is this honest? Those pauses created accountability. Now, the same person who writes the prompt might decide what data to collect, how to visualize it, and when to deploy.
The power has been democratized.
So has the responsibility.
In the old model, product teams naturally distributed ethical responsibility.
Even if a company wasn’t trying to be moral, the structure itself created friction—and friction kept things honest. Someone would always ask, “Should we?” before launch.
Today, those roles converge inside one keyboard. The ethical dialogue that once happened between disciplines now happens (if it happens at all) inside one person’s head—often under pressure to move fast.
At Meta, more than 32,000 people work in technology. At ByteDance, about 3,000 engineers power TikTok. Google employs nearly a thousand lawyers. Yet, even with all that manpower, we still see dark patterns, biased algorithms, and invasive data practices.
If companies that large can’t consistently get ethics right, what happens when the company is just you?
When I led design teams building products for companies like Uber, MGM, Siemens, and other major players, I learned that every discipline saw ethics through a different lens and asked its own version of the hard questions.
Those questions didn’t always align—but that tension was the point.
Friction slowed us down, but it also kept us safe.
Remove the friction, and you remove the safety.
Ethics isn’t a document you check after something goes wrong. It’s a reflex you build through repetition.
When I lead product teams, I ask them to start every project with three deceptively simple questions: Who does this help? Who could it harm? And how will we know it’s doing good?
The conversation that follows is usually messy. Is wasting a user’s time a form of harm? Is infinite scroll entertainment or exploitation? Do we measure “good” by engagement metrics or by mental health outcomes?
There’s rarely one answer.
The point is to practice asking—because the moment you stop, convenience becomes your ethics.
We’ve all seen them:
Free trials that quietly renew.
Fake scarcity counters.
Tiny “No thanks” links under giant “Join now” buttons.
These aren’t mistakes. They’re strategies. Someone designed them that way.
I once shared a Figma file with a client and granted edit access—standard practice. The moment they accepted, I was automatically charged $20 a month. No warning, no confirmation. Multiply that across dozens of designers and clients, and you’re suddenly losing thousands to “hidden” subscriptions.
It wasn’t a bug. It was a business model.
This is what happens when design, business, and development collapse into one person—or one algorithm. Ethical oversight falls through the cracks.
In a demo from Lovable, a 10-year-old named Theo built a math-tutoring app with a single prompt. The system wrote the code, designed the interface, and deployed it—instantly.
That’s amazing. And a little terrifying.
Because the same Theo can use that tool to build an app that helps his friends cheat. Or one that bullies a classmate. Or one that quietly scrapes data—without ever understanding the consequences.
Theo isn’t malicious. He’s just empowered.
And that’s the new reality: tools that used to belong to professional engineers now sit in the hands of children, creators, and founders who may never have been taught what ethical design even means.
If a 10-year-old can deploy an app in an afternoon, then ethics can’t wait for legal compliance training.
It has to start in education. It has to start with values.
At FullStack, the company where I once led design and product, a typical product team had sixteen people: two business stakeholders, a product owner, a project manager, three designers, six engineers, two QA testers, and a lawyer.
A few months ago, I used Base44 to build, by myself and in two weeks, an app that a full project team had once scoped as a year-long effort.
That’s the world we’re living in now.
Low-code and no-code tools have made software creation accessible to anyone with curiosity and Wi-Fi. That’s an extraordinary opportunity for innovation and independence. But it also means anyone can build something harmful—intentionally or not.
Ethics used to be distributed across teams. Now it must be distributed across humanity.
Speed is the drug of modern creativity. We celebrate “move fast and break things,” but we rarely ask who or what gets broken.
When everything becomes frictionless, we stop feeling resistance—and resistance is how we notice harm.
Ethics requires a pause. It lives in the moment between idea and execution.
That pause doesn’t have to kill momentum. It just means returning to the same deceptively simple questions before you ship.
Those questions take seconds to ask and can prevent years of regret.
For designers and technologists, ethics isn’t a constraint; it’s a core skill—like typography, debugging, or composition.
Every interface is a lesson in behavior.
Every database schema is a moral choice about what data matters.
Every notification is a small negotiation with someone’s attention.
Good ethics make good design.
At Raindrop Digital, we’ve built our practice around that principle:
Ethics isn’t a limitation. It’s a competitive advantage.
When you build transparently and thoughtfully, you don’t just avoid harm—you build trust. And trust compounds faster than virality ever will.
After years of building products for organizations large and small, I’ve come to believe that ethics in technology doesn’t need to be complicated. It just needs to be intentional.
These six principles aren’t commandments—they’re habits. Together, they form a simple guide for anyone building in the age of AI, where every decision carries more impact than ever before.
1. Design for dignity, not dopamine. Every product should enhance human well-being, even in small ways.
2. Success isn’t minutes captured; it’s meaning created.
3. Explain what your product does, what data it uses, and why. Transparency is the simplest form of respect.
4. AI amplifies everything—creativity and harm alike. Use it intentionally.
5. If you can’t secure it, don’t collect it. Privacy is empathy in technical form.
6. Own what you build. “I didn’t think about that” is not a defense—it’s a confession.
We’re raising a generation of Theos—builders empowered by AI tools but unequipped to handle the consequences.
The solution isn’t regulation alone. It’s education.
Ethics must become part of creative literacy, not an elective.
If we can teach people to prototype, we can teach them to pause.
We are living through the collapse of creative friction. Every barrier that once slowed us down—capital, code, complexity—has fallen.
That’s not a reason for fear. It’s a call for maturity. Because when creation becomes effortless, intention becomes everything.
Ethics isn’t a corporate policy. It’s a design discipline. It’s how we translate human values into digital form.
The next generation of builders won’t shape the moral landscape of technology through manifestos—they’ll do it through micro-decisions, one prompt, one feature, one pattern at a time.
So yes—it’s your problem now.
But that’s good news. Because it means the power to fix it is yours, too.
