Budakov Films successfully conducted the Bulgarian Focus Group Interview as part of the Erasmus+ co-funded project Digital Diversity: Crafting Inclusive AI Narratives (D2CIN), Project No. 2025-1-DE02-KA210-VET-000354956. The event, held on 17 October 2025, represented a significant milestone in the project’s Work Package A1.2 – Focus-Group Interviews and Analysis, bringing together UX and accessibility designers to explore the intersection of ethics, inclusivity, and artificial intelligence in education.

About the Interview

The focus group took place in two complementary sessions — a live discussion hosted at Sofia University “St. Kliment Ohridski,” Faculty of Journalism and Mass Communication (Hall 17), and an online panel conducted via Google Meet. Both were moderated by Asst. Prof. Yordan Karapenchev, PhD, Project Manager at Budakov Films Ltd. The hybrid format enabled participation from a diverse group of professionals with expertise in user experience, accessibility, and digital learning design.

Despite being held separately, the live and online sessions revealed remarkably consistent viewpoints. Participants engaged deeply with the central question of how AI-based avatars and adaptive systems can be designed to reflect human diversity, promote equity, and remain transparent and ethical in their behavior. The dialogue emphasized that inclusive AI design is not a purely technical challenge but a cultural and ethical one — where design decisions can either bridge or deepen social divides.

Key Outcomes of the Focus Group

1. Bias Mitigation and Ethics
Participants unanimously agreed on the need to address hidden biases in AI-driven design tools. They noted that artificial intelligence often mirrors the assumptions and limitations of its creators, reinforcing existing stereotypes. The group agreed that UX designers play an essential role in identifying and countering such biases, advocating for ethical reflection throughout the entire design process — from concept sketches to prototype testing. Transparency and user education emerged as key themes: participants stressed that people must understand how AI systems make decisions in order to feel safe and respected when using them.

2. Inclusive Data Management
The discussion on data management proved to be one of the most engaging parts of the session. Designers questioned how personalization and data storage practices affect user privacy and autonomy. While personalization can enhance accessibility, participants warned that sensitive user data — such as identity, gender, or cultural preferences — must always be stored only with explicit consent and strong safeguards. The group agreed that inclusivity cannot come at the expense of privacy, calling for clearer data policies and user control over stored information.

3. Inclusive UX and Digital Storytelling
A dynamic conversation unfolded around the design of AI avatars used in educational contexts. Participants explored how visual representation and behavior influence user perception and engagement. They debated whether avatars should appear neutral or reflect diverse cultural and gender identities. The consensus was that inclusive design should not aim for neutrality, but for authenticity and respect. Participants recommended offering customization options that allow users to choose avatars aligning with their comfort and cultural context, thereby promoting diversity through choice rather than prescription.

4. Accessibility Standards and Assistive Technology Integration
Several participants emphasized that accessibility should be integrated from the earliest design stages, not treated as a final adjustment. They discussed the importance of ensuring that AI-driven avatars are compatible with screen readers, captioning tools, and voice navigation. Beyond technical compliance, they advocated for cognitive and sensory accessibility — designing avatars whose tone, pace, gestures, and clarity make digital learning easier to follow for all learners.

5. Speech Evaluation and Adaptive Content
The group also explored how AI-driven speech and adaptive content could enhance or hinder learning experiences. Participants agreed that speech clarity, pacing, and neutrality are crucial to inclusivity. While adaptive content offers personalization potential, it must be transparent and explainable to avoid undermining learner trust. Designers cautioned that automated adaptation without user awareness could lead to confusion and information loss.

6. Explainability and User Control
Participants called for transparent AI systems that can explain their behavior in clear, non-technical language. They warned that excessive customization can overwhelm users, reducing usability and trust. Instead, they proposed user-friendly dashboards offering essential controls and brief explanations of why certain changes occur. The group stressed that understanding builds trust — and trust is fundamental to effective learning with AI.

Outcomes Informing Project Deliverables

The insights from this Bulgarian focus group will directly inform two major project outputs:

  • The Handbook “Accessible AI for Education: A Comprehensive Guide to Inclusive Design and Training”, which will provide educators, trainers, and designers with practical frameworks and real-world examples of how to implement inclusive, ethical AI in vocational education and training. The handbook will cover topics such as bias mitigation, universal design, accessibility integration, and inclusive storytelling.
  • The Personalized Feedback Navigator, a web-based tool that will assess and enhance educators’ and freelancers’ capabilities in delivering unbiased, inclusive AI-based training. Guided by three main criteria — bias detection, accessibility integration, and adaptive content delivery — the Navigator will provide actionable feedback and promote responsible AI use in educational contexts.

A Step Forward for Inclusive AI Narratives

Budakov Films is proud to contribute to the D2CIN project by facilitating dialogue that connects design expertise with ethical reflection. The insights collected in Sofia reaffirm the project’s central mission: to ensure that AI technologies in education are not only intelligent but also inclusive, transparent, and human-centered.

The consortium — led by Mimic Productions GmbH (Germany) in partnership with Budakov Films (Bulgaria) — continues to work collaboratively toward developing tools, guidelines, and resources that empower educators and designers to create equitable digital learning environments.

This focus group marks an important step in that journey. By engaging professionals from the UX and accessibility fields, D2CIN ensures that its outcomes are grounded in real-world practice and contribute to the broader European effort to design AI for everyone — with respect, empathy, and accountability.

For more information about the D2CIN project and future activities, visit: https://d2cin.eu/

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor EACEA can be held responsible for them.
Project No: 2025-1-DE02-KA210-VET-000354956