Human prejudice stretches back millennia, and the seeds of racism and bias that we sowed long ago have now taken root and flourished within artificial intelligence. Bias existed long before machine learning algorithms emerged; whenever society invents a new technology, it inherits the prejudices and discrimination of earlier eras. In the twentieth century, redlining maps dictated who could receive loans—systematically denying Black Americans access to mortgages, insurance, and other essential financial services. Today’s credit-scoring algorithms still mirror those same exclusions. As AI extends into recruitment, administration, medicine, and the media, alarm bells are sounding: if we do not imbue our machines with ethical values, they will merely magnify our deepest biases. (From “As We Code, So We Reap” by Debanjan Borthakur.)
I didn’t take it seriously when a friend in the internet security business told me that AI is reshaping the world and our future. Surely that was an exaggeration. Or so I thought until I was recruited to speak about intercultural leadership in the ‘Age of AI’ during a 3-day virtual symposium for SIETAR (Society for Intercultural Education, Training and Research). It was an honor, but also a vital opportunity to learn about AI from researchers and educators around the globe.
This year, I had the opportunity to teach Intergroup Relations at the University of Toronto as a part-time instructor. It was a new and enriching experience. While at the University of Rhode Island, I once took a course titled Non-Violence and Conflict Reconciliation—at the request of a friend. Since then, I’ve been deeply interested in issues of social harmony and justice. The question of how we can build peace in our society has often occupied my thoughts. Initially, the plan was to teach a different subject. But quite unexpectedly, I found myself teaching this course at a time when divisions between groups—across the world—are becoming sharper. Conflicts based on ideologies, religions, and identities continue to shape current political realities. The urgency of improving intergroup relations is felt not just in North America but equally in India and elsewhere. I was born in India, and I closely observe the socio-political issues of both societies.
It is important to know the difference between being good and being nice. Good people are not always nice. And nice people are not always good. Being nice is easy; being good is fiercely hard work.
The question is, do you choose to be a good person or a nice person? Pope Francis, whom we lost on Easter Monday, chose to be a good person. He understood what is preached in 1 John 3:18: good deeds make a difference, or in the vernacular, talk is cheap. We are what we do, and good people do good deeds.
ADR (Deborah Levine): Papa Balla, thank you for joining us. You’ve worn many hats in the AI and intercultural world: creator of SIETAR AI, promoter of the Intercultural AI Framework, and a lead contributor to the EU AI Act work within GPAI. Could you start by telling us what inspired the creation of the Virtual SIETAR AI Symposium?
Papa Balla Ndong: Thank you for having me. The Virtual SIETAR AI Symposium was born from a sense of urgency to ensure that intercultural perspectives are not an afterthought in the AI space, but a foundation. AI is shaping how we work, communicate, and understand the world. Yet, without cultural sensitivity and ethical alignment, it risks deepening global inequities. The symposium is a space where engineers, educators, policymakers, and cultural practitioners can co-create a more inclusive and responsible AI future.
ADR: How does the Intercultural AI Framework inform the structure or strategy of the Symposium?
Ndong: The Framework is the backbone. It’s not just a theory; it’s a methodology that centers on three pillars: intercultural sensitivity, iterative dialogue, and ethical adaptability. Each session in the symposium maps to one of these, whether we’re discussing dataset bias, AI ethics across borders, or the human element in machine learning. We’re not just talking about inclusion; we’re practicing it through multilingual panels, cross-regional collaboration, and time zone-aware scheduling.
ADR: SIETAR AI is still quite new. What role does it play in this initiative?
Ndong: SIETAR AI is our think-and-do tank. It connects interculturalists who may never have imagined themselves working with AI. Through this platform, we’ve trained educators on AI literacy, advised on ethical AI curricula, and collaborated with tech developers to humanize AI systems. For the Symposium, SIETAR AI serves as the bridge between the intercultural field and the technological ecosystem.
ADR: You were also selected as a lead contributor for the EU AI Act within GPAI. How has that shaped your perspective on global AI governance?
Ndong: Immensely. Being part of GPAI’s EU AI Act group means engaging in the practical drafting of codes of conduct and frameworks that could shape legislation. What I bring to the table, and advocate for, is the recognition of cultural plurality. We must understand that a one-size-fits-all approach to AI ethics won’t work. African values, Asian philosophies, Indigenous epistemologies: they all matter. The Symposium reflects this ethos by offering a platform for those voices to be heard and integrated into AI norms.
ADR: This all sounds very ambitious. What are the main challenges you’ve faced in organizing the Symposium?
Ndong: Time and trust. Coordinating across continents is a logistical puzzle. But even more, gaining the trust of communities who’ve been excluded from tech dialogues takes time. We’re saying: “Your voice is not only valid — it’s vital.” That shift doesn’t happen overnight. We’re learning to listen deeply and build long-term partnerships, not just events.
ADR: Who are some of the key collaborators or participants in this year’s edition?
Ndong: We’re bringing together UNESCO experts on education and technology, grassroots AI developers from Africa and Latin America, European policymakers, and even artists and poets. AI isn’t just technical — it’s deeply cultural and emotional.
ADR: And what would success look like for you, after the Symposium ends?
Ndong: Success is a seed. If someone leaves the symposium with a new partnership, a project idea, or simply the sense that they belong in the AI conversation — then we’ve done our job. We want the Intercultural AI Framework to live beyond documents and symposiums. It must become a living practice.
ADR: Finally, for our readers who might want to get involved, how can they connect?
Ndong: We’re open. Anyone can join the mailing list of SIETAR AI, attend the symposium (many sessions are free), or contribute to our collaborative Intercultural AI Framework. This is a global dialogue, and everyone has a seat at the table.
ADR: Thank you, Papa Balla. Your work is a reminder that technology without culture is incomplete — and that the future of AI must be both human and humane.
Ndong: Thank you — and may we build that future, together.
Note: Deborah will give a presentation on Intercultural Leadership in the Age of AI for the Symposium on Friday, April 11.
Howard Beale had finally reached his breaking point. He was not going to take it anymore. Beale is one of the central characters in the film Network (1976), played by Peter Finch, who won a posthumous Oscar for the role.
Here’s an excerpt of his on-air advice to viewers during those tumultuous times:
“I want you to get mad! I don’t want you to protest. I don’t want you to riot. I don’t want you to write to your congressman because I wouldn’t know what to tell you to write. I don’t know what to do about the depression and the inflation and the Russians and the crime in the street. All I know is that first you’ve got to get mad. I want all of you to get up out of your chairs. I want you to get up right now and go to the window. Open it, and stick your head out and yell, ‘I’M AS MAD AS HELL, AND I’M NOT GOING TO TAKE THIS ANYMORE!’”
I was honored to be interviewed by a university’s Holocaust and Genocide Education Center for International Holocaust Remembrance Day. I’ve worked in Holocaust education for more than 40 years, starting before I learned that my dad was a US military intelligence officer assigned to interrogate Nazi POWs. I did know that he’d been a soldier in World War II because as a kid, I found an old photo of him in uniform. Always a curious little critter, I asked, “Daddy, did you kill anyone in the war?” He answered, “No, but I slapped somebody once…the Nazi said that Hitler was great, but should have killed more Jews.”
Bridging Cultures, Economies, and Ethics for Humanity’s Next Frontier in Space
As humanity sets its sights on space, we are not just pushing the boundaries of science and technology; we are testing our ability to build equitable and sustainable societies in uncharted territory. Space exploration presents both extraordinary opportunities and profound ethical dilemmas: Will our expansion beyond Earth mirror historical patterns of exploitation, or will we seize this moment to create a more inclusive and cooperative future?
The global space economy is projected to exceed $1 trillion by 2040 (Morgan Stanley, 2022), fueled by innovations in satellite technology, asteroid mining, and interplanetary travel. But who will benefit from this new frontier? Will space be dominated by the wealthiest nations and corporations, or can we establish frameworks that ensure shared prosperity?
Celebrating Black History Month at Mizpah Congregation brought together members of Chattanooga’s Black and Jewish communities for the synagogue’s “Intriguing Conversations” series. These conversations are facilitated by Jed Mescon, a well-known media figure here in Chattanooga. Jed’s February interview was with John Edwards III, founder of The Chattanooga News Chronicle, our prominent African American newspaper. The flier announcing the event described Edwards as a civil rights hero who uses the typewriter to ensure that people of all colors enjoy the rights and freedoms that we often take for granted.
Are there ever such things? Or threads in the universe strumming, at just the right moment, to begin a new song? These were the thoughts floating through my mind, after connecting in a circle of grandmothers last weekend.
Enter stage right. A few hours pass with nine blessed souls: lives connecting for but a moment on the timeline of our lives. Yet profound, they rang as music to my ears, struggling to help loved ones understand the danger of our day, and the need to prepare. I heard about a World War II veteran, one woman’s father, whose commissioned study during and after World War II was to find out, among the Nazis,
“How did it happen? How did so many steer so far awry? And what was the state of mind of the German population by and large, immediately after?”