TRACK CHAIRS
K.D. Joshi
The College of Business
University of Nevada, Reno
1664 N Virginia St
Reno, NV 89557
kjoshi@unr.edu
Nancy Deng
College of Business Administration & Public Policy
California State University, Dominguez Hills
1000 E. Victoria Street
Carson, CA 90747
ndeng@csudh.edu
The latest developments in Information and Communication Technologies (ICT) such as automation and artificial intelligence have transformed our work, workplaces, institutions, societies, and communities. However, the favorable and unfavorable effects of ICTs are not distributed equally or uniformly across all contexts or populations in our society. Marginalized populations such as underrepresented, vulnerable, and underserved communities often bear the greatest burdens of technological change. Simultaneously, technology also provides powerful ways of safeguarding and improving humanity. This track focuses on socio-technical issues in marginalized contexts to not only uncover digital inequities and social injustices (e.g., the problem of bias in algorithmic systems, which gives rise to various forms of digital discrimination), but to find ways to build systems of empowerment through technology (e.g., designing and building technologies via value-sensitive designs).
This track calls for research that mitigates the risks of constructing a future where technological spaces, digital applications, and machine intelligence mirror a narrow and privileged vision of society with its biases and stereotypes. In this track, we create an outlet for scholars across various disciplines to conduct research that deeply engages ICTs in marginalized contexts. We welcome papers from a range of perspectives, including conceptual, philosophical, behavioral, and design science perspectives, and beyond.
Opportunities for Fast Track to Journal Publications: Selected authors of conference papers accepted by this track's minitracks will be invited to submit a significantly extended version (min. +30%) of their paper for consideration for publication in one of the following journals. Submitted papers will be fast-tracked through the review process.
This minitrack attracts and presents research on understanding and addressing the discrimination problems that arise in the design, deployment, and use of artificially intelligent systems.
A technology is biased if it unfairly or systematically discriminates against certain individuals by denying them an opportunity or assigning them a different and undesirable outcome. As we delegate more and more decision-making tasks to autonomous computer systems and algorithms, such as using artificial intelligence for employee hiring and loan approval, digital discrimination is becoming a serious problem.
Artificial Intelligence (AI) decision making can cause discriminatory harm to many vulnerable groups. In a decision-making context, digital discrimination can emerge from inherited prejudices of prior decision makers, designers, engineers or reflect widespread societal biases. One approach to address digital discrimination is to increase transparency of AI systems. However, we need to be mindful of the user populations that transparency is being implemented for. In this regard, research has called for collaborations with disadvantaged groups whose viewpoints may lead to new insights into fairness and discrimination.
Potential ethical concerns also arise in the use of AI built on Large Language Models (LLMs), such as ChatGPT, the AI chatbot released by the startup OpenAI in November 2022, which reached 100 million monthly active users just two months after its launch. Professor Christian Terwiesch at Wharton found that ChatGPT would pass a final exam in a typical Wharton MBA core curriculum class, which sparked a national conversation about the ethical implications of using AI in education. While some educators and academics have sounded the alarm over the potential abuse of ChatGPT for cheating and plagiarism, practitioners from the legal industry to the travel industry are experimenting with ChatGPT and debating its impact on business and the future of work. In essence, a Large Language Model is a deep learning algorithm trained on large volumes of text. Bias inherited from the training data can lead to new instances of digital discrimination, especially as various LLM-based models (e.g., DALL-E, Make-A-Video) are trained on data from different modalities (e.g., images and videos). Furthermore, the lack of oversight and regulation can also prove problematic. Given the rapid development and penetration of AI chatbots, it is important for us to investigate the boundaries between ethical and unethical use of AI, as well as potential digital discrimination in the use of LLM applications.
Addressing the problem of digital discrimination in AI requires a cross-disciplinary effort. For example, researchers have outlined social, legal, and ethical perspectives on digital discrimination in AI. In particular, prior research has called attention to three key aspects: how discrimination arises in AI systems; how the design of AI systems can mitigate such discrimination; and whether our existing laws are adequate to address discrimination in AI.
This minitrack welcomes papers in all formats, including empirical studies, design research, theoretical frameworks, and case studies, from scholars across disciplines such as information systems, computer science, library science, sociology, and law. Potential topics include, but are not limited to:
- AI-based Assistants: Opportunities and Threats
- AI Explainability and Digital Discrimination
- AI Systems Design and Digital Discrimination
- AI Use Experience of Disadvantaged / Marginalized Groups
- Biases in AI Development and Use
- ChatGPT and Ethical Use
- Digital Discrimination in Online Marketplaces
- Digital Discrimination and the Sharing Economy
- Digital Discrimination with Various AI Systems (LLM-based AI, AI assistants, etc.)
- Effects of Digital Discrimination in AI Contexts
- Ethical Use / Challenges / Considerations and Applications of AI Systems
- Literacy of AI Users
- Responsible AI practices to Minimize Digital Discrimination
- Responsible AI Use Guideline and Policy
- Societal Values and Needs in AI Development and Use
- Sensitive Data and AI Algorithms
- Social Perspective of Digital Discrimination
- Trusted AI Applications and Digital Discrimination
- User Experience and Digital Discrimination
Minitrack Co-Chairs:
Sara Moussawi (Primary Contact)
Carnegie Mellon University
smoussaw@andrew.cmu.edu
Xuefei Nancy Deng
California State University, Dominguez Hills
ndeng@csudh.edu
Jason Kuruzovich
Rensselaer Polytechnic Institute
kuruzj@rpi.edu
Across the workforce, new developments in collaboration tools, digital labor platforms, and artificial intelligence are changing the nature of work. Large-scale remote work during the COVID pandemic, coupled with ongoing economic uncertainty, has accelerated the adoption of a wide range of tools and practices that are altering how workers engage with stakeholders. The changing nature of work presents both challenges and opportunities for building more inclusive labor markets.
On the one hand, the changing nature of work allows a variety of tasks to be completed remotely, expanding access to work opportunities for individuals who may be marginalized by distance, lack of reliable transportation, or care responsibilities. In this manner, broader adoption of collaborative tools and digital platforms may enable meaningful employment opportunities for individuals who would otherwise be excluded from the digital workforce. On the other hand, underlying inequities in labor markets, derived from factors such as differing wage rates; discrimination based on ethnicity, national origin, race, religion, gender, or sexual orientation; differences in power among stakeholders; varying digital infrastructure across geographies; or regulatory variability, may be amplified and codified as work processes evolve. Further technology development, such as AI or robotics, may also automate tasks, disrupting the number and nature of opportunities for future employment.
This minitrack is focused on issues relating to how the changing nature of work may become a mechanism for enabling more inclusive work practices. This objective takes many forms, examining both the socio-technical factors that enable inclusive employment and the factors that create barriers to inclusion. We welcome submissions examining factors at any level of analysis, from global or national factors influencing labor markets to individual or team-level factors influencing work practices. Growing popular concern regarding the changing nature of work is centering these topics in our global understanding of labor markets. Increasing oversight by regulatory bodies demonstrates the importance, for both academia and policy makers, of not only understanding emerging work conditions but also articulating the impact of proposed interventions on labor markets.
As discussed above, technology is changing labor markets and work practices. While technology may enable greater employment access, it may also foster environments of power asymmetry: new technology may privilege the platform owners who have the power to control digital work environments (such as sourcing models, compensation models, and work policies) while disadvantaging workers, particularly the marginalized. Thus, we call for research that critically examines current work conditions and policies on the changing nature of work and proposes new work processes, platform designs, and policies to enhance digital work environments and foster social inclusion and equity.
Finally, it is important for both academia and industry to better understand the impact of the post-pandemic transformation on the changing nature of work. In the long term, technological developments at the intersection of remote work platforms and AI can potentially shape work at different levels. Research on the future of work and the essential skills and abilities of the future workforce will update our knowledge and broaden our vision of the next generation of the workforce.
Potential issues and topics on the changing nature of work and inclusive labor markets and work practices include, but are not limited to:
- Changing work conditions in technology centered work environments
- Changing nature of work in developing economies
- Changing nature of collective bargaining in a global workforce
- AI impacts on labor markets and career pathing
- Diversity, equity, and inclusion in technology enabled work environments
- Discrimination in technology centered work environments
- Impacts of the digital divide on labor markets
- Employment relations in distributed work environments and digital labor platforms
- The engagement of marginalized groups in emerging work environments
- Worker identity and engagement in the changing nature of work
- Psychological aspects of emerging work environments on workers (e.g., Technostress, Well-being)
- Public policy for more equitable work practices
- Social insurance protection for marginalized workers in crowdsourcing
- Legal and regulatory issues in labor relations in changing work environments
- Ethical issues in labor relations in changing work environments
Minitrack Co-Chairs:
Joseph Taylor (Primary Contact)
California State University, Sacramento
joseph.taylor@csus.edu
Lauri Wessel
European University Viadrina Frankfurt and Norwegian University of Science and Technology
wessel@europa-uni.de
Jan-Hendrik Passoth
Chair for Sociology of Technology
European University Viadrina Frankfurt
passoth@europa-uni.de
In contemporary society, technology access and usage are dominated by colonial power dynamics centering on the needs of people associated with specific demographics and experiences. This resembles a colonialist exercise of control, establishing who gets to use a tool or service and to what extent. For example, marginalized communities' experience with digital technology within former colonial contexts has been likened to what they underwent during colonization, when the overarching goal was to assimilate Indigenous communities into Western culture. Much of our research is concerned with investigating digital technology from Western perspectives (e.g., theories and methods developed in the West that are often unfit to explore issues of coloniality). An absence of decolonial methods and theories in the IS literature and social studies has led some researchers to use Western and/or Euro-centric methods to explore and explain social aspects of technology, thus reinforcing a colonial mindset. Scholars have called this a new form of colonization using digital technologies and have called for the decolonization of research methods and theories.
This minitrack welcomes decolonization research that showcases decolonial perspectives, using local epistemologies such as Indigenous theories and methods, and highlights how decolonial approaches to technology and society can help overcome oppression and contribute to a more pluriversal society.
We invite scholars to consider this minitrack to be a platform to discuss decoloniality. Fundamental questions of interest are: What does it mean to decolonize information systems? How can we challenge present colonial legacies in a digital society and imagine decolonial futures? How can we theorize and develop decolonized technologies using decolonial approaches at the local and global levels? Potential topics include, but are not limited to:
- Decoloniality, critical race issues and technology
- Decolonizing gender and sex through technology
- Decolonizing the curriculum and higher education using technology
- Decolonial approaches to technology design
- Data colonialism and new forms of coloniality using digital technologies
- Data justice and digital activism in decolonial contexts
- Application of decolonial methods and theories (e.g., Kaupapa Māori)
- Application of decolonial philosophies (e.g., Ubuntu)
Minitrack Co-Chairs:
Hameed Chughtai (Primary Contact)
Lancaster University
h.chughtai@lancaster.ac.uk
Sherae Daniel
University of Cincinnati
daniesr@ucmail.uc.edu
Ann Majchrzak
University of Southern California
majchrza@usc.edu
Diversity, equity, and inclusion (DEI) initiatives have come to the forefront as a core value in organizations. Organizations have therefore focused on elevating DEI in their strategic plans and are considering various initiatives to support DEI goals. In recent years, modern technologies have helped to overcome some invisible barriers that prevent people from reaching a space where they can be seen for their talents, skills, and abilities rather than for distinctive characteristics such as gender, religion, disability, age, and skin color. The literature on workplace diversity and organizational inclusion acknowledges that diversity is often a prerequisite for, but not synonymous with, inclusion. In inclusive organizations, people of all identities are empowered to contribute to the larger collective as valued members. Specifically, inclusive technology cultures facilitate and encourage employee engagement, collaboration, and community participation, fostering a greater sense of belonging and loyalty while allowing employees to maintain a unique identity.
Digital technologies refer to devices such as personal computers, tablets, electronic tools and systems, virtual reality, and the Internet, as well as technologies that generate, store, or process data, including social media, online games, multimedia, wearable technologies, and mobile phones. Previous research has provided insights into DEI's impacts on employees and organizations. Yet, for organizations to fully benefit from technology-driven initiatives that support DEI, it is critical for IS researchers to extend their knowledge and understanding of the development of digital DEI initiatives, their implications, and their organizational outcomes.
This minitrack draws on the premise of an organization as a socio-technical system. It focuses on the IT workforce, technology tools, and the digital driving forces that promote diversity, equity and inclusion in organizations. As such, research in this minitrack lies at the intersection of multiple disciplines, namely Science, Technology, Organizational Science, Behavioral Science, and Design Science.
This minitrack welcomes theoretical and empirical studies addressing organizational, managerial, technical, and behavioral perspectives on digital DEI business solutions and impacts. Potential issues and topics include, but are not limited to:
- Digital DEI organizational solutions
- Digital social inclusion and organizational culture
- Technology tools for promoting DEI in the organization
- Digital inclusion and the workforce
- Diversity and the IT workforce
- Equity in the IT workforce
- Inclusion in the IT workforce
- Diversity and digital recruitment, hiring, and retention strategies
- Digital DEI and the workforce
- Digital DEI and the organization
- Ethical implications in the use of technology for organizational DEI
- Risk management in digital DEI initiatives
- Methodologies for studying digital DEI in organizations
- Digital organizational strategies and practices associated with DEI
- New frameworks to describe and explain the phenomenon of digital DEI and the organization
- Roles and responsibilities of IS departments in developing and supporting digital DEI initiatives
- The use of technology for organizational DEI goals and objectives
- The dark side of digital DEI initiatives
Minitrack Co-Chairs:
Ester Gonzalez (Primary Contact)
California State University, Fullerton
esgonzalez@fullerton.edu
Sam Zaza
Middle Tennessee State University
sam.zaza@mtsu.edu
Angsana Techatassanasoontorn
Auckland University of Technology
angsana@aut.ac.nz
The digital divide refers to the gap between those who have access to and use of digital technologies and those who do not. The divide and resulting inequities can take a number of forms. Despite significant progress in the adoption and use of information and communication technologies (ICTs), there is still a substantial gap in the levels of digital inclusion and equity between some members of vulnerable populations and other members of society.
Vulnerable populations, which may include but are not limited to youth, the elderly, persons with disabilities, low-income, rural and indigenous communities, marginalized castes, refugees, those who are stateless, and under-served regions in various developing and developed contexts, often face systemic barriers in accessing, adopting, and using ICT. This digital divide has significant social, economic, and political implications and further deepens the potential inequality and exclusion of these populations.
This call for papers invites original research papers, case studies, and review articles that investigate the digital divide and its impact on vulnerable populations, as well as initiatives that address these vulnerabilities, moving towards digital equity and inclusion. We encourage submissions that address, but are not limited to, the following topics:
- Access to ICT: infrastructure, affordability, and availability
- ICT adoption and use: barriers, opportunities, and challenges
- Digital skills and literacy: training and education for vulnerable populations
- Digital inclusion policies and strategies: best practices and lessons learned
- Social and cultural factors: attitudes and perceptions towards ICT
- Gender, race, caste, and ethnicity: intersectionality and the digital divide
- ICT and health: the role of digital technologies in promoting health equity
- ICT and education: the impact of the digital divide on learning outcomes
- ICT and political participation: digital democracy and political engagement
- ICT and economic development: the role of digital technologies in reducing poverty and inequality
- Digital social innovation and digital social intermediation: the role of social intermediaries in leveraging ICTs to address SDGs
- Unintended consequences as a result of ICT use or efforts to bridge the digital divide
We welcome interdisciplinary and comparative studies that employ a variety of methods, including but not limited to, qualitative and quantitative research, case studies, experiments, surveys, and mixed methods approaches. We encourage submissions from both established and emerging scholars, including graduate students and practitioners.
Minitrack Co-Chairs:
Israr Qureshi (Primary Contact)
Australian National University
Israr.Qureshi@anu.edu.au
Carmen Leong
University of New South Wales
Carmen.leong@unsw.edu.au
Arlene Bailey
University of the West Indies
arlene.bailey@uwimona.edu.jm
The interplay of gender and technology is fundamental in understanding the role gender plays in marginalizing or empowering individuals in the technology space. Accelerating gender balance in technology is a social justice issue. Information Technology (IT) is powering and influencing all aspects of our lives. Therefore, the future will be shaped and controlled by people who know how to use, design, and build technology. Gender balance in the technology space is imperative to ensure that the future of work and life is not decided for individuals who are not well represented in this space. This minitrack is designed to give voice to research on gender and technology, promoting discourse and uncovering deep and rich insight into the topic.
This minitrack seeks to attract research that conceptualizes, theorizes, and operationalizes the gender construct as a social identity, not just as a dichotomous biological sex category. In addition, we encourage the use of gender-based theories, such as the Individual Differences Theory of Gender and IT and Gender Role Theory, to articulate the conceptualization of gender.
This minitrack invites gender-focused analysis of societal, organizational, and individual factors that not only advance our understanding of how gender shapes the technology milieu but also reveal interventions that can help attenuate gender inequities and imbalance. Topics of interest include, but are not limited to:
- Applying the Intersectionality perspective to advance gender analysis in IT research
- Designing “Gender-free” technology
- Feminist perspectives on gender and technology
- Gender analysis of the history of technology
- Gender analysis of the use and consumption of technology
- Gender analysis of design and construction of technology
- Gender attitudes toward technology
- Gender biases and stereotypes in the technology industry
- Gender, identity, and technology use
- Gender imbalance in the technology field
- Gender pay gap in the technology field
- Gender role congruity and technology career pathways
- Gendered nature of technology leadership
- Gendered opportunities and risks of new technologies
- Gendered patterns in the use of new technologies
- Hegemonic masculinity in the technology industry
- Imposter syndrome and women in technology
- New approaches to conceptualizing and operationalizing gender and technology
- Role of power in creating gender equity within the technology fields
- Tech entrepreneurship and gender
- Work-life balance in the technology field
Minitrack Co-Chairs:
K.D. Joshi (Primary Contact)
University of Nevada, Reno
kjoshi@unr.edu
Regina Connolly
Dublin City University
regina.connolly@dcu.ie
Ita Richardson
University of Limerick
ita.richardson@ul.ie
Mina Jafarijoo
Stockton University
Mina.Jafarijoo@stockton.edu
Social justice is the belief that everyone deserves fair and equal treatment. ICT and social justice research refers to studies of actions that promote equal rights, equal opportunities, and equal treatment, as well as studies of the use of ICT to uncover social injustice. The guiding principles of social justice are human rights; access to basic elements such as food, water, shelter, safety, education, and opportunities; equal participation in decision-making; and equity to reduce systemic barriers and ensure every individual is treated fairly and equitably.
Criminal justice is an umbrella term that refers to the laws, procedures, institutions, and policies at play before, during, and after the commission of a crime. Criminal justice rests on two central ideas: suspects, convicted criminals, and victims of crime all have certain rights; and criminal conduct should be prosecuted and punished by the state following established laws.
Given that ICTs are involved in the way that we as individuals carry out our work and leisure activities, in the way that we organize ourselves in groups, in the forms that our organizations take, in the type of societies we create, and thus in the future of the world, ICTs are deeply implicated in criminal and social justice, as information systems inscribe our understanding of the world and our attendant prejudices. We know, for instance, that algorithms embedded in AI can amplify racism, sexism, ableism, and other forms of discrimination.
This minitrack invites submissions of original work concerning the intersection of information systems research with social and criminal justice. We welcome studies of the use of ICT to uncover inequalities and injustice, to promote justice at all levels (e.g., racial, climate, age), and to advance equality and equity for those with fewer privileges, such as people of color (POC), refugees and asylum seekers, unhoused people, and people with disabilities. We also welcome critical approaches to these topics, as well as submissions of research-in-progress and practically oriented work with the potential to make significant contributions to research in this area. The relevant topics for the minitrack include, but are not limited to, the following areas:
- ICT and social inclusion
- ICT and racial injustice
- ICT and equality and equity
- ICT and climate justice
- ICT and voting rights
- ICT and gun violence
- ICT and income gap
- ICT and ageism
- The application of datafication and AI in criminal justice
- Bias and discrimination in algorithms
- AI and predictive policing
- Big data and risk assessment
- Facial recognition in criminal justice
- Dataveillance, security, and privacy
- Datafication and AI applications in border control
- Feminist perspectives in data justice
Minitrack Co-Chairs:
Cathy Urquhart (Primary Contact)
Manchester Metropolitan University
C.Urquhart@mmu.ac.uk
Indira Guzman
California State Polytechnic University Pomona
irguzman@cpp.edu
Angela D. R. Smith
University of Texas at Austin
angela.smith@ischool.utexas.edu
Yingqin Zheng
Royal Holloway, University of London
Yingqin.Zheng@rhul.ac.uk
Information and communication technologies (ICT) have changed the practices used by illicit actors and those seeking to interdict illegal or exploitative activity. Advances in information technology have led to new business models and business practices that expand illicit actors’ markets, increase the risk and scope of victimization, and allow illicit actors to evade detection. ICT also enables illicit actors to gain access to marginalized groups, who are often already vulnerable to exploitation. Law enforcement agencies and governments react to these adaptations by illicit actors, often by trying to comply with or reform aging laws and policies that fail to keep up with information technology and criminal behavior. In addition, many legitimate organizations are seeking opportunities to use ICT to identify and mitigate the use of their products or services by illicit actors in an effort to protect their stakeholders and organization from harm or exploitation.
This minitrack welcomes research exploring the intersection of ICT and illicit activity that has a physical world component and/or the use of ICT by illicit actors that targets or exploits marginalized groups. We welcome a range of methodological approaches as well as conceptual, theoretical, empirical, and methodological papers. We are interested in research from a range of perspectives, such as how criminal behavior is altered due to ICTs, interventions by law enforcement, civil agencies, NGOs, or businesses to detect, disrupt, or dismantle illicit networks, and the role of information technology to serve and support victims of crime and exploitation in gaining access to justice. Topics of interest include, but are not limited to:
- Changing behaviors among illicit actors as a result of ICT
- Technology and its impact on human trafficking activity, prevention, or interdiction
- Exploitation of marginalized groups with information technology
- Use of ICT to provide marginalized groups better access to justice
- Use of information technology by government agencies, non-profit organizations, or businesses to detect, disrupt, or dismantle illicit networks
- Development of interventions to detect and prevent illicit activity that intersects the physical and online worlds
- Use of ICT to protect marginalized and vulnerable populations from exploitation
- Protection of individuals in marginalized groups from illicit actors through policies, technology design, or interventions
- Linking illicit actors’ activities with their cyber identities and/or personal identity
Minitrack Co-Chairs:
Stacie Petter (Primary Contact)
Wake Forest University
petters@wfu.edu
Gisela Bichler
California State University San Bernardino
gbichler@csusb.edu
Michael Fullilove
DeliverFund
Michaelhfullilove@gmail.com
Laurie Giddens
University of North Texas
laurie.giddens@unt.edu
STEM fields offer numerous exciting and lucrative career opportunities, but unfortunately, these fields are often characterized by a lack of diversity and inclusivity. This minitrack will focus on addressing barriers to equity and social justice in STEM education and careers, with a particular emphasis on underserved populations.
The minitrack will explore new angles and approaches to promoting equity and social justice in STEM education and careers, including the following topics:
- Cultivating interest and fostering access: To promote equity in STEM education, it is important to cultivate interest in these fields among underserved populations, and to provide access to high-quality STEM education. This session will explore innovative programs and initiatives that are designed to introduce underserved populations to STEM fields and provide them with the tools and resources they need to succeed.
- Implementing inclusive pedagogical and curricular innovations and practices in STEM education: This session aims to provide a forum for scholars and practitioners to share their experiences and insights on how to create a more inclusive and equitable learning environment in STEM education. We encourage submissions from researchers, educators, and practitioners from a variety of disciplinary backgrounds, as well as those who work with learners of all ages, from K-12 to postsecondary education.
- Addressing systemic barriers: Despite progress in promoting diversity and inclusivity in STEM fields, there are still many systemic barriers that prevent underserved populations from achieving success. This session will explore these barriers and discuss strategies for addressing them, including policy changes, community outreach, and mentorship programs.
- Advancing opportunities: In order to promote equity in STEM fields, it is important to create opportunities for underserved populations to succeed. This session will explore innovative approaches to advancing opportunities for underserved populations in STEM careers, including internship and apprenticeship programs, career development workshops, and entrepreneurship initiatives.
- Amplifying diverse voices: It is essential to amplify the voices of diverse individuals in STEM fields, including those from underserved populations. This session will explore the importance of diversity in STEM fields and highlight successful initiatives that have increased representation and promoted inclusivity.
- Engaging industry partners: Industry partners can play a crucial role in promoting equity and social justice in STEM education and careers. This session will explore partnerships between academic institutions, community organizations, and industry partners to create meaningful opportunities for underserved populations in STEM fields.
- Data and Assessment: Using data and assessment to track progress and ensure accountability for promoting equity and social justice in STEM education is key. This session will explore best practices for collecting, analyzing, and using data to inform decision-making and measure the effectiveness of initiatives and interventions.
Overall, this minitrack will provide a unique opportunity to explore innovative approaches to promoting equity and social justice in STEM education and careers, with a focus on underserved populations. Participants will gain insights into successful initiatives, discuss best practices, and build partnerships to advance these important goals.
Minitrack Co-Chairs:
Curtis Cain (Primary Contact)
Howard University
caincc@howard.edu
Asli Yagmur Akbulut
Grand Valley State University
akbuluta@gvsu.edu
Benyawarath Yaa Nithithanatchinnapat
Penn State Behrend
benyawarath@psu.edu
At their best, social media platforms connect individuals from all over the world to facilitate learning, the spread of creative ideas, inclusivity, and access to resources. At their worst, however, social media platforms marginalize individuals through manipulation, exclusion, and exploitation across all age groups and demographics. Academic social media research in marginalized contexts is becoming increasingly important from both practical and theoretical perspectives. Research concerning both the “bright side” and “dark side” of social media for equity and inclusion is needed to help information systems research become an agent for empowerment and social change. In this space, there are many important, yet unanswered, research questions.
This minitrack invites papers on all types of social media platforms investigating the positive and negative aspects of social media in marginalized contexts. Our goal with this minitrack is to facilitate a scholarly discussion of social media use in order to identify innovative approaches to maintain a safe and productive online environment that creates social well-being for the greater good. We welcome empirical, theoretical, or position papers. Topics of interest include, but are not limited to, the following:
- Spread of hatred and racism on social media platforms
- Biases associated with de-platforming and re-platforming on social media platforms
- How social media platforms may be used to promote or stifle sustainable initiatives through (un)civil discourse
- Spear phishing attacks and other security threats targeted towards vulnerable groups based on their social media activities
- The use of analytics on social media platforms to hinder or facilitate the spread of social movements
- Artificial intelligence enabled marketing on social media platforms that might be perceived as discriminatory and inequitable
- Social media use that facilitates or inhibits the spread of human trafficking
- Cyberbullying and how to defend against it on social media platforms
- The spread of gender (in)equity and gender (in)equality on social media platforms
- How social media provides emotional support for marginalized groups pursuing STEM careers
- How perceived inequities in the judicial systems are communicated and discussed on social media
- Ethical, legal issues, and freedom of speech issues on social media platforms
- How social media might spread social (in)justice
- Impact that social media has on law enforcement or other government agencies, which may be both positive and negative
- The role that social media plays in the dissemination of fake news and disinformation campaigns
- Crowdfunding for marginalized groups and differential patterns of lending
- The role that social media plays in promoting or inhibiting cancel culture
- How social media facilitates or inhibits different types of social movements
- The differential role that social media plays in depression, isolationism, and disconnectedness for under-represented groups
- The use of social media to reduce poverty for marginalized groups
The above list of suggested topics is not an all-inclusive list. We encourage authors to define marginalized contexts broadly. We welcome all theoretical and methodological approaches.
Minitrack Co-Chairs:
Tom Mattson (Primary Contact)
University of Richmond
tmattson@richmond.edu
Jie Ren
Fordham University
jren11@fordham.edu
Qin Weng
University of Arkansas
qinweng@uark.edu