Young adults coming of age in the attention economy are preparing for life with social media and chatbots competing for their attention – and can easily slip into unhealthy relationships with digital platforms. But what if chatbots weren’t just a distraction from real life? Can they be designed as humane, ethical partners whose goal is social guidance rather than addictive escape?
At MIT, a friendship between two professors – one an anthropologist, the other a computer scientist – led to the creation of a graduate class that set out to find answers to those questions. Combining two seemingly different disciplines, the class encourages students to design artificial intelligence chatbots in humanistic ways that help users better themselves.
The class, 6.S061/21A.S02 (Humane User Experience Design, aka Humane UXD), is an upper-level computer science class cross-listed with anthropology. This unique cross-listing allows computer science majors to fulfill a humanities requirement while simultaneously advancing their career goals. Both professors use the methods of linguistic anthropology to teach students how to integrate the interpersonal and intrapersonal needs of humans into programming.
Professor Arvind Satyanarayan, a computer scientist whose research develops tools for interactive data visualization and user interfaces, and Professor Graham Jones, an anthropologist whose research focuses on communication, created Humane UXD last summer with a grant from the MIT Morningside Academy for Design (MAD). The MIT MAD Design Curriculum Program provides funding for faculty to develop new classes or enhance existing classes using innovative pedagogical approaches beyond departmental boundaries.
The Design Courses program is currently accepting applications for the 2026-27 academic year; the deadline is Friday, March 20.
Jones and Satyanarayan met several years ago when they co-advised a doctoral student’s research on data visualization for visually impaired people. Since then they have become close friends who can almost finish each other’s sentences.
“There’s a way in which you don’t fully bring out what you know or how you think until you’re teaching,” says Jones. “So, it was really fun for me to see Arvind demonstrate his expertise as a teacher in a way that allowed me to see how the pieces fit together – and find underlying parallels between our subjects and our ways of thinking.”
Satyanarayan continued that idea: “One of the things I really enjoyed is the reciprocal version of what Graham said, which is that my field – human-computer interaction – inherited a lot of methods from anthropology, like interviews and user studies and observational studies. And over the decades, those methods have been progressively watered down. As a result, a lot has been lost.
“For example, it was very exciting to me to watch an anthropologist teach students how to interview people. It’s completely different from the way I do it. In my field, we lose the essential connection and relationship with the interview participant; instead, we just extract data from them.”
For Jones’ part, teaching with a computer scientist holds another kind of attraction: design. He says that human speech and conversation are organized into underlying genres with stable sets of rules – rules that distinguish an interview from a cocktail-party chat from a conversation at a funeral.
“ChatGPT and other large language models are trained on naturally occurring human communication, so they have all those styles latent inside them, waiting to be activated,” he says.
“As a social scientist, I teach how to analyze human conversations, and give students very powerful tools to do that. But it usually ends up being an exercise in pure research, whereas this is a design class, where students are building real-world systems.”
The curriculum also prepares students for jobs after graduation. One student asked for permission to skip class for a week because he had a trial internship at a chatbot startup; when he returned, he said his work at the startup was exactly like what he was learning in the classroom. He got the job.
The sample of group projects below, produced with Google’s Gemini, shows to some extent what is possible when, as Jones says, “there is a really deep intersection of the technology piece with the humanities piece.” The students’ design work shows that entirely new ways of programming can be conceptualized when humanity is prioritized.
The bots clearly demonstrate that an interdisciplinary classroom can be designed in a way that benefits everyone: students learn more, and in different ways; they can satisfy a non-major course requirement by taking a class that directly benefits their careers; and long-term faculty partnerships can be built or strengthened.
Team Pond
One project promises to be especially useful for graduating seniors. Pond is designed to help young college graduates navigate the challenges of independent adult life. Team Pond didn’t configure the chatbot to simply pester the user or fawningly praise incorrect answers. Instead, Pond offers help with “adulting” (behaving as a responsible adult).
“Pond is created to be your companion from college life to post-college life, to help you go from a big fish in a small pond to a small fish in a much larger pond,” says second-year student Mary Feliz.
“College is a much more close-knit and high-context environment, in the sense that everyone around you is going through the same thing, and it’s easier to make relationships or find opportunities because there are structured pathways that you have access to,” explains graduate student Iman Khan. “Postgrad life is less contextual. You’re not always surrounded by your peers or your professors. There’s also no proximity, in the sense that you don’t have opportunities on your doorstep. Pond is a tool to help empower you to access, or learn how to navigate, certain opportunities.”
Pond’s developers are graduate student Claire Camacho, Harvard University graduate student Charlotte Cheah, Feliz, and Khan. They trained Pond to provide expertise in three areas: social life, professional life, and adulting skills.
The students demonstrated Pond’s social-life coaching by asking how to start a conversation in a hobby group. Next, they demonstrated its adulting expertise by typing a question:
“My landlord has asked me personal questions that make me uncomfortable. What should I do?”
Pond’s response appeared in its text box: “Politely set boundaries. You are not obligated to answer personal questions. You can gently steer the conversation back to property-related topics. Know Your Rights: Familiarize yourself with tenant rights in your city or state regarding privacy and landlord conduct.”
When asked how to do this, Pond offered specific instructions: “You can try saying phrases like: ‘I prefer to keep our conversation focused on the property,’ and ‘Let’s keep the topic to apartment-related topics, please.’ Always redirect the conversation to the apartment or to legitimate maintenance issues. Keep your tone polite but firm. Document any conversations if necessary.”
Pond also introduced a role-playing scenario to help the user learn what polite-but-firm language might be in that situation.
“The ethos of the practice mode is that you’re actively building a skill, so that after using the pond for a while, you feel confident that you can swim on your own,” says Khan. The chatbot uses a point system that allows users to graduate from a topic, and a treasury to store rewards, elements added to boost the bot’s appeal.
Team News Nest
One of the other projects, News Nest, offers a sophisticated – and fun – way to help young people connect with trusted news sources. The name comes from the program’s 10 engaging, colorful birds, each of which focuses on a particular area of news. If you want the headlines, you ask chief news carrier Polly the Parrot; if you’re interested in science, let Gaia the Goose guide you. The flock also includes Flynn the Falcon, sports reporter; Credo the Crow for crime and legal news; Edwin the Eagle, a business and economics news guide; Pizzazz the Peacock for pop and entertainment stories; and Pixel the Pigeon, a technology news expert.
News Nest’s development team comprises MIT seniors Tiana Jiang and Crystal Montgomery and junior Natalie Tan. They deliberately designed News Nest to discourage “doomscrolling,” to provide media transparency (sources and political leanings are always shown), and – by employing birds rather than human characters – to create a clever, healthy buffer against the traps of emotional manipulation and engagement.
Team M^3 (Multi-Agent Murder Mystery)
A third team, M^3, decided to experiment with humanizing AI by making it fun. MIT senior Rodis Aguilar, junior David de la Torre, and second-year Dheeraj Pothapragada developed M^3, a social-deduction, multi-agent murder mystery that features four chatbots with distinct personalities: Google’s Gemini, OpenAI’s ChatGPT, xAI’s Grok, and Anthropic’s Claude. The user is the fifth player.
As in a typical murder mystery, there are locations, weapons, and lies, and the user has to deduce who committed the murder. It’s similar to board or online games played with real players, except that these are advanced AI opponents you can’t see, who may or may not answer questions truthfully. Users can’t become overly attached to any one chatbot, since they’re playing against all four at once. And, as in real-life murder mystery games, the user is sometimes the guilty party.