We can’t talk about ethical AI if we treat people in unethical ways
By Leyya Sattar
The Indiaspora ‘Forum for Good’ 2025 brought together 500 leaders from 34 countries to tackle some of the biggest questions of our time, from responsible AI and climate resilience to global health, geopolitics, and the future of diaspora-led investment. It was a packed few days of talks, panels, and networking around the theme "New Models for the World", a call to leave a better legacy for future generations.
Scenes from the Indiaspora ‘Forum for Good’. An inspiring conference exploring the future of AI, trade, climate, and diaspora-led impact. Grateful to witness and learn from thought-provoking talks by (starting from top left) Vivek Oberoi, His Excellency Sheikh Nahayan Mabarak Al Nahayan, MR Rangaswami, Pritha Venkatachalam, Isabella Sreyashii Sen, Shalini Govil-Pai, Cherie Blair, and more.
The BAPS Hindu Mandir temple, Abu Dhabi
For me, the biggest takeaway came long before we sat down at the conference. On day one, we visited the Sheikh Zayed Grand Mosque and the newly built BAPS Hindu Mandir temple, where we were welcomed with a private talk from the Swami-ji (a spiritual teacher and leader), who offered a moment of pause in what would become a whirlwind of talks about AI and innovation. He said:
"If this conference is going to focus on financial capital, and AI, trade, investment...we cannot forget about the most important thing which is human capital."
I'm not in love with the term "human capital", but his point landed. We're racing to build smarter machines, faster systems, and scalable products, yet we still haven't figured out how to build humane, fair, and ethical workplaces, which has always been our goal at Other Box.
Throughout the week, I kept returning to his message. I met fascinating and inspiring people working across AI, tech, and VC-backed startups, and had so many great conversations with brilliant minds, but many of them circled back to growth, speed, and disruption. Yet from my experience of working with hundreds of businesses behind the scenes, workplaces are burning out their teams, ignoring inclusion, and treating inclusive workplace culture as a "nice to have" instead of a core foundation.
What is the difference between AI and us?
Yes, AI has the potential to support humanity in incredible ways: advancing healthcare, tackling climate challenges, streamlining systems, and even removing time-wasting bureaucracy and needless paperwork in all aspects of life (like my fellow panellist Utkarsh Saxena, founder of Adalat AI, who is building technological innovations for justice systems in India and the global south). But only if we stay rooted in ethics, accountability, and care for the people behind the progress.
The biggest difference between AI and us is our morals and ethics. The AI and tech industry already shows us what happens when we sideline people: glaring gender gaps (shoutout to my new friend Ishani Singh, founder of GirlsRuleAI, empowering girls from around the world with the tools to build solutions using AI), racial bias embedded in algorithms, and environmental costs we're only beginning to understand, with new research showing how data centres are draining local water supplies. Meanwhile, we shame consumers for their carbon footprint while giving massive billionaire tech companies a free pass.
We cannot talk about ethical AI if we treat people in unethical ways.
As someone from the South Asian diaspora, it was powerful to be in spaces where ancient wisdom, modern innovation and lived experiences intersected. It reminded me that our roots and our values must travel with us into the future we’re building.
Swami-ji also went on to talk about perception shifting depending on where you stand.
"When you're at home, you pray for your family. But when I take you to the moon, you pray for the world."
That shift in perspective is everything. It's what allows us to zoom out and ask better questions, like: Who's being left out of these innovations? Who's paying the price for our progress? And what kind of world are we really building?
I don’t have the answers, and I definitely can’t fix these issues, but I can be part of the solution by asking better questions and helping my clients do the same. The future of AI and our shared humanity depends on all of us asking better questions together.
That’s what Other Box has always been about: slowing down enough to look at the bigger picture, holding up a mirror, and supporting companies in becoming more conscious of how they treat people, how they make decisions, and what kind of impact they want to have. We help organisations move beyond performative DEI to build cultures that prioritise psychological safety, inclusion, and long-term impact.
It can feel like you're already behind if you're not keeping up.
Being a bootstrapped solo founder, going at my own pace, and staying grounded in my values isn’t always easy, especially when I'm surrounded by outside noise: new AI tools launching every week, endless social media content, and pressure to scale faster, do more, and be everywhere.
But I've realised that pace is a privilege. And choosing to build slowly, intentionally, and with integrity is a radical act in a world obsessed with speed. It allows me to stay close to the people and purpose behind my work. It helps me show up fully for my clients and create work that makes a difference, not just noise for the algorithm.
Real change doesn't come from moving fast and breaking things. It comes from pausing long enough to ask: Why are we building this? Who is it for? Who does it harm? Who does it help?
---
If your organisation is thinking about the future of work, AI, inclusion, or workplace culture, and doesn’t want to leave people behind in the process, get in touch with Other Box. We help teams build ethical, human-centred workplaces where innovation and integrity go hand in hand.