TRACK CHAIRS

K.D. Joshi

The College of Business
University of Nevada, Reno
1664 N Virginia St
Reno, NV 89557
kjoshi@unr.edu

Nancy Deng

College of Business Administration & Public Policy
California State University, Dominguez Hills
1000 E. Victoria Street
Carson, California, 90747
ndeng@csudh.edu

The latest developments in Information and Communication Technologies (ICT) such as automation and artificial intelligence have transformed our work, workplaces, institutions, societies, and communities. However, the favorable and unfavorable effects of ICTs are not distributed equally or uniformly across all contexts or populations in our society. Marginalized populations such as underrepresented, vulnerable, and underserved communities often bear the greatest burdens of technological change. Simultaneously, technology also provides powerful ways of safeguarding and improving humanity. This track focuses on socio-technical issues in marginalized contexts to not only uncover digital inequities and social injustices (e.g., the problem of bias in algorithmic systems, which gives rise to various forms of digital discrimination), but to find ways to build systems of empowerment through technology (e.g., designing and building technologies via value-sensitive designs).

This track calls for research that mitigates the risks of constructing a future where technological spaces, digital applications, and machine intelligence mirror a narrow and privileged vision of society with its biases and stereotypes. In this track, we create an outlet for all scholars across various disciplines to conduct research that deeply engages ICTs in marginalized contexts. We welcome papers from a range of perspectives, including conceptual, philosophical, behavioral, and design science and beyond.

Opportunities for Fast Track to Journal Publications: Authors of selected papers accepted by this track's minitracks will be invited to submit a significantly extended version (min. +30%) of their paper for consideration for publication in one of the following journals. Submitted papers will be fast-tracked through the review process.

This minitrack attracts and presents research on understanding and addressing the discrimination problems that arise in the design, deployment, and use of artificial intelligence systems.

Digital discrimination refers to discrimination against individuals or social groups that stems from unequal access to Internet-based resources, from biased practices in data mining, or from prejudices inherited in a decision-making context. A technology is biased if it unfairly or systematically discriminates against certain individuals by denying them an opportunity or assigning them a different and undesirable outcome. As we delegate more and more decision-making tasks to autonomous computer systems and algorithms, such as using artificial intelligence for employee hiring and loan approval, digital discrimination is becoming a serious problem.

Artificial Intelligence (AI) decision making can cause discriminatory harm to many vulnerable groups. In a decision-making context, digital discrimination can emerge from inherited prejudices of prior decision makers, designers, engineers or reflect widespread societal biases. One approach to address digital discrimination is to increase transparency of AI systems. However, we need to be mindful of the user populations that transparency is being implemented for. In this regard, research has called for collaborations with disadvantaged groups whose viewpoints may lead to new insights into fairness and discrimination.

Potential ethical concerns also arise in the use of AI built on Large Language Models (LLMs), such as ChatGPT, the virtual AI chatbot launched in November 2022 by the startup OpenAI, which reached 100 million monthly active users just two months after its debut. Professor Christian Terwiesch at Wharton found that ChatGPT would pass a final exam in a typical Wharton MBA core curriculum class, which sparked a national conversation about the ethical implications of using AI in education. While some educators and academics have sounded the alarm over the potential abuse of ChatGPT for cheating and plagiarism, industry practitioners from the legal industry to the travel industry are experimenting with ChatGPT and debating the impact of AI on business and the future of work. In essence, a Large Language Model is a deep learning algorithm trained on large volumes of text. Bias inherent in that data can lead to emerging instances of digital discrimination, especially as various LLM-based models (e.g., DALL-E, Make-A-Video) are trained on data from different modalities (images, videos, etc.). Furthermore, the lack of oversight and regulation can also prove problematic. Given the rapid development and penetration of AI chatbots, it is important for us to investigate the boundaries between ethical and unethical use of AI, as well as potential digital discrimination in the use of LLM applications.

Addressing the problem of digital discrimination in AI requires a cross-disciplinary effort. For example, researchers have outlined social, legal, and ethical perspectives on digital discrimination in AI. In particular, prior research has called attention to three key aspects: how discrimination arises in AI systems; how the design of AI systems can mitigate such discrimination; and whether our existing laws are adequate to address discrimination in AI.

This minitrack welcomes papers in all formats, including empirical studies, design research, theoretical frameworks, and case studies, from scholars across disciplines such as information systems, computer science, library science, sociology, and law. Potential topics include, but are not limited to:

  • AI-based Assistants: Opportunities and Threats
  • AI Explainability and Digital Discrimination
  • AI Literacy of users
  • AI Systems Design and Digital Discrimination
  • AI Use Experience of Disadvantaged / Marginalized Groups
  • Biases in AI Development and Use
  • Digital Discrimination in Online Marketplaces
  • Digital Discrimination and the Sharing Economy
  • Digital Discrimination with Various AI Systems (LLM based AI, AI assistants, etc.)
  • Effects of Digital Discrimination in AI Contexts
  • Ethical Use, Challenges, Considerations, and Applications of AI Systems
  • Generative AI (e.g., ChatGPT) Use and Ethical Implications
  • Organizational Perspective of Digital Discrimination
  • Responsible AI Practices to Minimize Digital Discrimination
  • Responsible AI Use Guideline and Policy
  • Societal Values and Needs in AI Development and Use
  • Sensitive Data and AI Algorithms
  • Social Perspective of Digital Discrimination
  • Trusted AI Applications and Digital Discrimination
  • User Experience and Digital Discrimination
Minitrack Co-Chairs:

Sara Moussawi (Primary Contact)
Carnegie Mellon University
smoussaw@andrew.cmu.edu

Jason Kuruzovich
Rensselaer Polytechnic Institute
kuruzj@rpi.edu

Minoo Modaresnezhad
University of North Carolina Wilmington
modaresm@uncw.edu

Xuefei Nancy Deng
California State University, Dominguez Hills
ndeng@csudh.edu

At its best, social media connects individuals worldwide to facilitate learning, the spread of creative ideas, inclusivity, and access to resources. At its worst, however, social media marginalizes individuals and groups through manipulation, exclusion, and exploitation across all groups and demographics. Marginalized contexts refer to any situation or context in which certain individuals or groups are treated as insignificant and/or pushed to the margins of society and rendered powerless. Academic social media research in marginalized contexts is becoming increasingly important from both practical and theoretical perspectives. Research concerning both the “bright side” and “dark side” of social media for equity, inclusion, justice, and marginalized contexts is needed to help information systems research be an agent for social change. In this space, there are many important, yet unanswered, research questions.

This minitrack invites papers on all types of social media, investigating their positive and negative aspects in marginalized contexts. Our goal with this minitrack is to facilitate a scholarly discussion of social media use in order to identify innovative approaches to maintain a safe and productive online environment that creates social well-being for the greater good. We encourage a broad definition of “marginalized contexts”, by which, we refer to any situation or context with an unequal power dynamic or group membership. We welcome empirical, theoretical, or position papers. Topics of interest include, but are not limited to, the following:

  • Spread of hatred and racism on social media
  • Biases associated with de-platforming and re-platforming on social media
  • How social media may be used to promote or stifle sustainable initiatives through (un)civil discourse
  • Spear phishing attacks and other security threats targeted towards vulnerable groups based on their social media activity
  • The use of analytics on social media to hinder or facilitate digital (in)equity and social (in)justice
  • The negative unintended consequences of using artificial intelligence on social media
  • Social media use that facilitates or inhibits the spread of human trafficking
  • Cyberbullying on social media and defense mechanisms
  • The spread of gender inequities and gender equality on social media
  • How social media provides emotional support for marginalized groups
  • How perceived inequities in the judicial systems are communicated and discussed on social media
  • Ethical, legal issues, and freedom of speech issues on social media
  • How social media might spread social (in)justice
  • Impact that social media has on law enforcement or other government agencies, which may be both positive and negative
  • The role that social media plays in the dissemination of fake news, disinformation, and conspiracy theories
  • Crowdfunding for marginalized groups and differential patterns of lending
  • The role that social media plays in promoting or inhibiting cancel culture
  • How social media facilitates or inhibits different types of social movements
  • The differential role that social media plays in depression, isolationism, and disconnectedness for under-represented groups

The above list of suggested topics is not an all-inclusive list. We encourage authors to define digital equity, social justice, and marginalized contexts broadly. We welcome all theoretical and methodological approaches.

Minitrack Co-Chairs:

Tom Mattson (Primary Contact)
University of Richmond
tmattson@richmond.edu

Jie Ren
Fordham University
jren11@fordham.edu

Qin Weng
Baylor University
qin_weng@baylor.edu

Across the workforce, new developments in collaboration tools, digital labor platforms, and artificial intelligence are changing the nature of work. Large-scale remote work spread widely during the COVID pandemic and is likely to remain an integral part of how many companies manage work. Additionally, ongoing economic uncertainty and crises have accelerated the adoption of a wide range of tools and practices that are altering how workers engage with stakeholders. The changing nature of work presents both challenges and opportunities for building more inclusive labor markets.

On the one hand, the changing nature of work allows a variety of tasks to be completed remotely, expanding access to work opportunities for individuals who may be marginalized by distance, limited access to reliable transportation, or care responsibilities. In this manner, broader adoption of collaborative tools and digital platforms may enable meaningful employment opportunities for individuals who would otherwise be excluded from the digital workforce. On the other hand, underlying inequities in labor markets, derived from factors such as differing wage rates; discrimination based on ethnicity, national origin, race, religion, gender, or sexual orientation; differences in power among stakeholders; varying digital infrastructure across geographies; or regulatory variability, may be amplified and codified as work processes evolve. Further technological development, such as AI or robotics, may also automate tasks, disrupting the number and nature of opportunities for future employment.

This minitrack is focused on issues relating to how the changing nature of work may become a mechanism for enabling more inclusive work practices. This objective takes many forms, examining both the socio-technical factors that enable inclusive employment and the factors that create barriers to inclusion. We welcome submissions examining factors at any level of analysis, spanning from global or national factors influencing labor markets to individual or team factors influencing work practices. Increasing popular concern regarding the changing nature of work is centering these topics in our global understanding of labor markets. Increasing oversight by regulatory bodies demonstrates the importance, for both academia and policy makers, of not only understanding emerging work conditions but also articulating the impact of proposed interventions on labor markets.

We call for research that critically examines current work conditions and policies related to the changing nature of work, and that proposes new work processes, platform designs, and policies to enhance digital work environments and foster social inclusion and equity. In this regard, our minitrack answers the call by the IS community to enhance DEI in relation to IS and IT development, use, and impacts.

Finally, it is important for both academia and industry to better understand the impact of the post-pandemic transformation on the changing nature of work. In the long term, technological developments at the intersection of remote work platforms and AI can potentially shape work at many levels. Research on the future of work and the essential skills and abilities of the future workforce will update our knowledge and broaden our vision of the next generation of the workforce.

Potential issues and topics on the changing nature of work and inclusive labor markets and work practices include, but are not limited to:

  • Diversity, equity, and inclusion in technology enabled work environments
  • Employment relations in distributed digital organizations
  • Ethical and regulatory issues in the labor relations in changing work environments
  • The changing nature of work in developing economies
  • The engagement of marginalized groups in emerging work environments
  • Algorithmic based discrimination in technology centered work environments
  • AI impacts on labor markets and career pathing
  • Changing work conditions
  • Impacts of the digital divide on labor markets
  • The changing nature of collective bargaining in a global workforce
  • Worker identity and engagement in the changing nature of work
  • Psychological aspects of emerging work environments on workers (e.g., Technostress, Well-being)
Minitrack Co-Chairs:

Joseph Taylor (Primary Contact)
California State University, Sacramento
joseph.taylor@csus.edu

Lauri Wessel
European University Viadrina, Frankfurt, and Norwegian University of Science and Technology
wessel@europa-uni.de

Jan-Hendrik Passoth
European University Viadrina, Frankfurt
passoth@europa-uni.de

Systems of all sizes, shapes, and types have always contained abuses of power. Understanding and mitigating such abuse is one way to create better systems that support people to be their best, optimizing performance and teamwork. This minitrack aims to explore how the global system sciences community, in collaboration with other diverse stakeholders worldwide (and synergistic disciplines), can leverage the power of history, social science, and information technology to glean valuable insights that can be made actionable and harnessed to combat abuses of power in systems. Addressing such abuses early can potentially prevent negative real-world consequences for people navigating those systems, consequences that have proven especially detrimental to members of marginalized communities.

This minitrack seeks to advance conceptual, historical, and empirical approaches to combating abuses of power in systems broadly defined. For guidance, we define abuses of power in systems as occurring when influence and control are misused by one or more individuals within systems to the detriment of others. We are open to a variety of research methods and theoretical approaches encompassed within the systems sciences, focused on abuse in systems and/or understanding systemic abuses of power. Papers should bring a transdisciplinary perspective and should be written for a broad and diverse target audience. Potential topics for exploration include, but are not limited to:

  • How to advance prevention of, and/or active surveillance for “big abuses” of power rooted in systemic enablers, incentives, and cultures in academia and in other societal institutions.
  • How to engender transparency in reporting, measurement, and amelioration of systemic abuses of power.
  • How to reimagine and drive cultural transformation to eradicate abuses of power at organizational and system levels in academia and beyond, including how to recognize and surmount barriers they pose to innovation.
  • How to envision a path to learning systems underpinning continuous improvement in ameliorating systemic abuses of power in academia and beyond.
  • How to raise social consciousness vis-à-vis such issues so as to mobilize society toward ultimately eradicating them.
  • How to empower Davids to outmaneuver abusive systems of Goliaths supported by “Armies of Enablers”.
  • How to understand the social, economic, and (physical and mental) health consequences of abuse.
  • How to collaboratively formulate a research agenda to guide and propel this global scientific community forward.
Minitrack Co-Chairs:

Melissa Ocepek (Primary Contact)
University of Illinois
mgocepek@illinois.edu

Joshua Rubin
University of Michigan Medical School
Josh@JoshCRubin.com

Rebecca Kush
Learning Health Community
rkush@catalysisresearch.com

In contemporary society, technology access and usage are dominated by colonial power dynamics that center the needs of people associated with specific demographics and experiences. This resembles a colonialist exercise of control, establishing who gets to use a tool or service and to what extent. For example, marginalized communities’ experience with digital technology within former colonial contexts has been likened to what they underwent during colonization, when the overarching goal was to assimilate Indigenous communities into Western culture.

Much of our research investigates digital technology from Western perspectives (e.g., theories and methods developed in the West that are often unfit to explore issues of coloniality). The absence of decolonial methods and theories in the IS literature and social studies has led some researchers to use Western and/or Euro-centric methods to explore and explain social aspects of technology, thus reinforcing a colonial mindset. Scholars have called this a new form of colonization using digital technologies, and have called for the decolonization of research methods and theories.

This minitrack welcomes decolonization research that showcases decolonial perspectives, using local epistemologies such as Indigenous theories and methods, and highlights how decolonial approaches to technology and society can help overcome oppression and contribute to a more pluriversal society. We invite scholars to consider this minitrack to be a platform to discuss decoloniality. Fundamental questions of interest are: What does it mean to decolonize information systems? How can we challenge present colonial legacies in a digital society and imagine decolonial futures? How can we theorize and develop decolonized technologies using decolonial approaches at the local and global levels? Potential topics include, but are not limited to:

  • Data justice and digital activism in decolonial contexts
  • Decoloniality, critical race issues and technology
  • Decolonizing gender and sex through technology
  • Decolonial ethics and Artificial Intelligence
  • Decolonial approaches to technology design
  • Data colonialism and new forms of coloniality
  • Application of Indigenous methods and theories (e.g., Kaupapa Māori)
  • Application of Indigenous philosophies (e.g., Ubuntu)
Minitrack Co-Chairs:

Hameed Chughtai (Primary Contact)
Lancaster University
h.chughtai@lancaster.ac.uk

Sherae Daniel
University of Cincinnati
daniesr@ucmail.uc.edu

Pitso Tsibolane
University of Cape Town
pitso.tsibolane@uct.ac.za

Diversity, equity, and inclusion (DEI) initiatives have taken the forefront as a core value in organizations. Therefore, organizations have focused on elevating DEI in their strategic plans and considering various initiatives to support DEI goals. In recent years, modern technologies have helped to overcome some invisible barriers that prevent people from reaching a space where they can be seen for their talents, skills, and abilities rather than focusing on their distinctive characteristics, such as gender, religion, disability, age, and skin color. The literature on workplace diversity and organizational inclusion acknowledges that diversity is often a prerequisite but not synonymous with inclusion. In inclusive organizations, people of all identities are empowered to contribute to the larger collectives as valued members. Specifically, inclusive technology cultures facilitate and encourage employee engagement, collaboration, and community participation, fostering a greater sense of belonging and loyalty while allowing them to maintain a unique identity.

Digital technologies refer to devices such as personal computers, tablets, electronic tools, systems, virtual reality and the Internet. In addition, digital technologies generate, store or process data. These include social media, online games, multimedia, wearable technologies, and mobile phones, among others. Previous research has provided insights into DEI’s impacts on employees and organizations. Yet, for organizations to fully benefit from technology-driven initiatives that support DEI, it is critical for IS researchers to extend their knowledge and understanding of the development of digital DEI initiatives and their implications and organizational outcomes.

This minitrack draws on the premise of an organization as a socio-technical system. It focuses on the IT workforce, technology tools, and the digital driving forces that promote diversity, equity, and inclusion in organizations. As such, research in this minitrack lies at the intersection of multiple disciplines, namely Science, Technology, Organizational Science, Behavioral Science, and Design Science.

The Call for Papers welcomes theoretical and empirical studies addressing organizational, managerial, technical, and behavioral perspectives on digital DEI business solutions and impacts. Potential issues and topics include, but are not limited to:

  • Digital DEI organizational solutions.
  • Digital social inclusion and organizational culture.
  • Technology tools for promoting DEI in the organization.
  • Digital inclusion and the workforce.
  • Diversity and the IT workforce.
  • Equity in the IT workforce.
  • Inclusion in the IT workforce.
  • Diversity and digital recruitment, hiring, and retention strategies.
  • Digital DEI and the workforce.
  • Digital DEI and the organization.
  • Ethical implications in the use of technology for organizational DEI.
  • Risk management in digital DEI initiatives.
  • Methodologies for studying digital DEI in organizations.
  • Digital organizational strategies and practices associated with DEI.
  • New frameworks to describe and explain the phenomenon of digital DEI and the organization.
  • Roles and responsibilities of IS departments in developing and supporting digital DEI initiatives.
  • The use of technology for organizational DEI goals and objectives.
  • The dark side of digital DEI initiatives.
Minitrack Co-Chairs:

Ester Gonzalez (Primary Contact)
California State University, Fullerton
esgonzalez@fullerton.edu

Sam Zaza
Middle Tennessee State University
sam.zaza@mtsu.edu

Angsana Techatassanasoontorn
Auckland University of Technology
angsana@aut.ac.nz

The digital divide refers to the gap between those who have access to and use of digital technologies and those who do not. The divide and resulting inequities can take a number of forms. Despite significant progress in the adoption and use of information and communication technologies (ICTs), there is still a substantial gap in the levels of digital inclusion and equity between some members of vulnerable populations and other members of society.

Vulnerable populations, which may include but are not limited to youth, the elderly, persons with disabilities, low-income, rural and indigenous communities, marginalized castes, refugees, those who are stateless, and under-served regions in various developing and developed contexts, often face systemic barriers in accessing, adopting, and using ICT. This digital divide has significant social, economic, and political implications and further deepens the potential inequality and exclusion of these populations.

This call for papers invites original research papers, case studies, and review articles that investigate the digital divide and its impact on vulnerable populations, as well as initiatives that address these vulnerabilities, moving towards digital equity and inclusion. We encourage submissions that address, but are not limited to, the following topics:

  • Access to ICT: infrastructure, affordability, and availability
  • ICT adoption and use: barriers, opportunities, and challenges
  • Digital skills and literacy: training and education for vulnerable populations
  • Digital inclusion policies and strategies: best practices and lessons learned
  • Social and cultural factors: attitudes and perceptions towards ICT
  • Gender, race, caste, and ethnicity: intersectionality and the digital divide
  • ICT and health: the role of digital technologies in promoting health equity
  • ICT and education: the impact of the digital divide on learning outcomes
  • ICT and political participation: digital democracy and political engagement
  • ICT and economic development: the role of digital technologies in reducing poverty and inequality
  • ICT and social entrepreneurship: Role of microentrepreneurs and social entrepreneurs
  • Digital social innovation and digital social intermediation: the role of social intermediaries in leveraging ICTs to address SDGs
  • Unintended consequences as a result of ICT use or efforts to bridge the digital divide

We welcome interdisciplinary and comparative studies that employ a variety of methods, including but not limited to, qualitative and quantitative research, case studies, experiments, surveys, and mixed-methods approaches. We encourage submissions from both established and emerging scholars, including graduate students and practitioners.

Minitrack Co-Chairs:

Israr Qureshi (Primary Contact)
Australian National University
Israr.Qureshi@anu.edu.au

Carmen Leong
University of New South Wales
Carmen.leong@unsw.edu.au

Arlene Bailey
University of the West Indies
arlene.bailey@uwimona.edu.jm

K.D. Joshi
University of Nevada, Reno
kjoshi@unr.edu

The interplay of Gender and Technology is fundamental in understanding the role gender plays in marginalizing or empowering individuals in the technology space. Accelerating gender balance in technology is a social justice issue. Information Technology (IT) is powering and influencing all aspects of our lives. Therefore, the future will be shaped and controlled by people who know how to use, design, and build technology. Gender balance in the technology space is imperative to ensure that the future of work and life is not decided for individuals who are not well represented in this space. This minitrack is designed to give voice to such research to promote discourse and uncover deep and rich insight into the topic of gender in technology.

This minitrack seeks to attract research that conceptualizes, theorizes, and operationalizes the gender construct as a social identity and not just as a biological sex with dichotomous categories. In addition, we encourage the use of gender-based theories, such as the Individual Differences Theory of Gender and IT and Gender Role Theory, to articulate the conceptualization of gender. This minitrack invites gender-focused analyses of societal, organizational, and individual factors that not only advance our understanding of how gender shapes the technology milieu but also reveal interventions that can help attenuate gender inequities and imbalance. Topics of interest include, but are not limited to:

  • Applying the Intersectionality perspective to advance gender analysis in IT research
  • Designing “Gender-free” technology
  • Feminist perspectives on gender and technology
  • Gender analysis of the history of technology
  • Gender analysis of the use and consumption of technology
  • Gender analysis of design and construction of technology
  • Gender attitudes toward technology
  • Gender biases and stereotypes in the technology industry
  • Gender, identity, and technology use
  • Gender imbalance in the technology field
  • Gender pay gap in the technology field
  • Gender role congruity and technology career pathways
  • Gendered nature of technology leadership
  • Gendered opportunities and risks of new technologies
  • Gendered patterns in the use of new technologies
  • Hegemonic masculinity in the technology industry
  • Imposter syndrome and women in technology
  • New approaches to conceptualizing and operationalizing gender and technology
  • Role of power in creating gender equity within the technology fields
  • Tech entrepreneurship and gender
  • Work-life balance in technology field
  • Understanding and removing barriers to STEM careers for women
  • Gendered roles and digital entrepreneurship
Minitrack Co-Chairs:

Regina Connolly (Primary Contact)
Dublin City University
regina.connolly@dcu.ie

Cliona McParland
Dublin City University
Cliona.McParland@dcu.ie

Mina Jafarijoo
Stockton University
Mina.Jafarijoo@stockton.edu

Information and communication technologies (ICT) have changed the practices used by illicit actors and by those seeking to interdict illegal or exploitative activity. They have also led to new business models and business practices that expand illicit actors’ markets, increase the risk and scope of victimization, and allow illicit actors to evade detection. ICT also enables illicit actors to gain access to marginalized groups, who are often already vulnerable to exploitation. Law enforcement agencies and governments react to these adaptations by illicit actors, often by trying to comply with or reform aging laws and policies that fail to keep up with technological advances and criminal behavior. In addition, many legitimate organizations seek opportunities to use ICT to identify and mitigate the use of their products or services by illicit actors, protecting their stakeholders and organizations from harm or exploitation.

The focus of this minitrack is to open a dedicated space for research related to the use of ICTs in understanding and promoting criminal justice, as well as in protecting individuals in marginalized groups from illicit activities and actors. Criminal justice is an umbrella term that refers to the laws, procedures, institutions, and policies at play before, during, and after the commission of a crime. A central idea of criminal justice is that suspects, convicted criminals, and victims of crime all have certain rights.

This minitrack welcomes research exploring the intersection of ICT and illicit activity that has a physical-world component, and/or the use of ICT by illicit actors to target or exploit marginalized groups. We welcome conceptual, theoretical, empirical, and methodological papers employing a range of approaches. We are interested in research from a variety of perspectives, such as how criminal behavior is altered by ICTs; interventions by law enforcement, civil agencies, non-governmental organizations (NGOs), or businesses to detect, disrupt, or dismantle illicit networks; and the role of IT in helping victims of crime and exploitation gain access to justice. Additionally, we are interested in papers that seek to understand how illicit actors leverage ICTs in illicit networks (e.g., criminal networks) to exploit various types of victims (i.e., firms, governments, individuals, and groups), especially marginalized ones.

This minitrack invites submissions of original work concerning the intersection of information systems research with criminal justice. The relevant topics for the minitrack include, but are not limited to, the following areas:

  • ICT and gun violence
  • The application of datafication and AI in criminal justice
  • AI and predictive policing
  • Big data and risk assessment
  • Facial recognition in criminal justice
  • Dataveillance, security, and privacy
  • Datafication and AI applications in border control
  • Generative AI-related scams and phishing attacks
  • Generated online hate for large-scale “hate-raids”
  • Social engineering attacks, such as using Generative AI voice models
  • Jail-breaking Generative AI to elicit harmful responses/escalate privileges
  • Generative AI biases against marginalized and/or specific groups (e.g., by ethnicity or political affiliation)
  • Generative AI errors that disproportionately harm marginalized groups (e.g., reliance on hallucinations)
  • Use of Generative AI to exacerbate polarization (e.g., synthetic media)
  • Illegal content generation (e.g., CSAM and NCII)
  • Attacks against Generative AI
Minitrack Co-Chairs:

Nishant Vishwamitra (Primary Contact)
University of Texas at San Antonio
nishant.vishwamitra@utsa.edu

Daniel Pienta
University of Tennessee, Knoxville
dpienta@utk.edu

Kim-Kwang Raymond Choo
University of Texas at San Antonio
raymond.choo@fulbrightmail.org

This minitrack provides a forum for open and vibrant discussion for research related to the use of Information and Communication Technologies (ICT) in understanding and promoting social justice. Social justice has become particularly relevant to information systems (IS) researchers in light of the emergence of increasingly autonomous and sophisticated ICT, including biometrics and deep learning-based AI systems.

Social justice is the belief that everyone deserves fair and equal treatment, and it serves as a theoretical grounding for burgeoning research on the oppressive and dehumanizing potential of modern ICT. Such technologies are developed and deployed through the mass acquisition and curation of human-centric data, in some cases without individuals’ consent, which is an affront to human dignity and to the very essence of being human in a just society. Thus, ICT and social justice research refers to studies of actions that promote equal rights, equal opportunities, and equal treatment among individuals, organizations, and the technologies themselves, as well as studies that use ICT to uncover social injustice.

The guiding principles of social justice are human rights; access to basic elements such as food, water, shelter, safety, education, and opportunity; equal participation in decision-making; and equity that reduces systemic barriers so that every individual is treated fairly and equitably.

So why is social justice part of our remit as IS researchers? Walsham (2005) says that ICTs are involved in the way that we as individuals carry out our work and leisure activities, in the way that we organize ourselves in groups, in the forms that our organizations take, in the types of societies we create, and thus in the future of the world. ICTs are therefore deeply implicated in social justice, as IS inscribe our understanding of the world, and our attendant prejudices. Emergent ICT such as biometrics and modern AI systems are often, by design, developed through the collection and extraction of increasing amounts of human data, and in turn can unilaterally shape our perceptions of the world, and thus pose imminent and existential threats to social justice and humanity.

This minitrack invites submissions of original work concerning the intersection of IS research with social justice. We welcome studies on the uses of ICT to uncover inequalities and injustice and to promote justice at all levels (e.g., racial, climate, and age), as well as equality and equity for those with fewer privileges, such as people of color (POC), refugees and asylum seekers, unhoused people, and people with disabilities. We also welcome critical approaches to these topics. Our goal is to spur discussion through research explorations that can enhance understanding and open new opportunities to derive novel ways of preserving and improving individual and societal well-being. The relevant topics for the minitrack include, but are not limited to, the following areas:

  • ICT and social inclusion
  • ICT and racial injustice
  • ICT and equality and equity
  • ICT and climate justice
  • ICT and voting rights
  • ICT and income gap
  • ICT and ageism
  • ICT and individuality, humanness, and human dignity
  • Feminist perspectives in data justice
Minitrack Co-Chairs:

Jan Kietzmann (Primary Contact)
University of Victoria
jkietzma@uvic.ca

Andrew Park
University of Victoria
apark1@uvic.ca

Introduced by Satoshi Nakamoto in a 2008 white paper, Bitcoin was the first peer-to-peer electronic currency, igniting a wave of interest in Blockchain, Cryptocurrency, and FinTech. Yet despite significant interest, more than a decade after their emergence these technologies have yet to become everyday tools for consumers. They still face numerous technical challenges, including scalability, security, privacy, interoperability, and energy consumption (Vasiljeva et al., 2016), along with challenges in business adoption and social trust; ethical, environmental, and regulatory controversies; and the potential for illicit activities.

For marginalized contexts, such as in developing economies, the challenges of integrating Blockchain, Cryptocurrency, and FinTech are also substantial. Issues such as the digital divide, lack of infrastructure, regulatory uncertainties, and the need for education and digital literacy can impede the adoption and effective utilization of these technologies.

This minitrack welcomes researchers from all disciplines, ranging from the ‘hard sciences’ such as engineering and computer science, to social sciences, finance, management, and beyond. We invite submissions that employ a variety of methodologies, including but not limited to algorithm/system design, experiments, simulation, theoretical analysis, empirical research, surveys, design science, development of theoretical frameworks, qualitative inquiries, and case studies. Our goal is to cultivate a vibrant dialogue across a wide range of methodological perspectives, thereby advancing understanding and fostering innovation in this field. Our scope of interest spans a wide range of topics, including, but not limited to:

Technical Aspects:

  • The responsible applications of AI and Machine Learning in Blockchain, Cryptocurrency, and FinTech.
  • Development of responsible Algorithms, Protocols, and Consensus Mechanisms.
  • Scalability, Security, Decentralization, Interoperability, Transparency, Accountability, and Standardization in Blockchain, Cryptocurrency, and FinTech.
  • Quantum Computing and Cryptography in Blockchain, Cryptocurrency, and FinTech.
  • Open-source Development in Blockchain, Cryptocurrency, and FinTech.

Social Aspects:

  • Responsible implementation of Blockchain, Cryptocurrency, and FinTech in marginalized contexts and developing economies.
  • Bridging the Digital Divide and promoting Financial Inclusion and Social Justice through Blockchain, Cryptocurrency, and FinTech.
  • Enhancing Education and Financial Literacy within these domains.
  • Supporting Small and Medium-sized Enterprises (SMEs) with Blockchain, Cryptocurrency, and FinTech.
  • The Future of Work in the era of Blockchain, Cryptocurrency, and FinTech.

Business and Economic Aspects:

  • The adoption of Blockchain, Cryptocurrency, and FinTech.
  • Central Bank Digital Currencies (CBDCs).
  • Microfinance and Crowdfunding through Blockchain, Cryptocurrency, and FinTech.
  • Regulation and Governance in the Blockchain, Cryptocurrency, and FinTech sectors.
  • The Geopolitical Landscape and Cross-border Applications of Blockchain, Cryptocurrency, and FinTech.

Environmental and Ethical Aspects:

  • Energy Efficiency and Sustainability in Blockchain, Cryptocurrency, and FinTech.
  • Combating Illicit Activities with Blockchain, Cryptocurrency, and FinTech.
  • Ethics and Data Privacy in Blockchain, Cryptocurrency, and FinTech.
Minitrack Co-Chairs:

Yibai Li (Primary Contact)
University of Scranton
yibai.li@scranton.edu

Kaiguo Zhou
Capital University of Economics and Business
zhoukg@cueb.edu.cn

Wanli Liu
Guangzhou Xinhua University
liuwlariel@xhsysu.edu.cn

STEM fields offer numerous exciting and lucrative career opportunities, but unfortunately, these fields are often characterized by a lack of diversity and inclusivity. This minitrack will focus on addressing barriers to equity and social justice in STEM education and careers, with a particular emphasis on underserved populations.

The minitrack will explore new angles and approaches to promoting equity and social justice in STEM education and careers, including the following topics:

  • Cultivating interest and fostering access: To promote equity in STEM education, it is important to cultivate interest in these fields among underserved populations, and to provide access to high-quality STEM education. This session will explore innovative programs and initiatives that are designed to introduce underserved populations to STEM fields and provide them with the tools and resources they need to succeed.
  • Implementing inclusive pedagogical and curricular innovations and practices in STEM education: This session aims to provide a forum for scholars and practitioners to share their experiences and insights on how to create a more inclusive and equitable learning environment in STEM education. We encourage submissions from researchers, educators, and practitioners from a variety of disciplinary backgrounds, as well as those who work with learners of all ages, from K-12 to postsecondary education.
  • Addressing systemic barriers: Despite progress in promoting diversity and inclusivity in STEM fields, there are still many systemic barriers that prevent underserved populations from achieving success. This session will explore these barriers and discuss strategies for addressing them, including policy changes, community outreach, and mentorship programs.
  • Advancing opportunities: In order to promote equity in STEM fields, it is important to create opportunities for underserved populations to succeed. This session will explore innovative approaches to advancing opportunities for underserved populations in STEM careers, including internship and apprenticeship programs, career development workshops, and entrepreneurship initiatives.
  • Amplifying diverse voices: It is essential to amplify the voices of diverse individuals in STEM fields, including those from underserved populations. This session will explore the importance of diversity in STEM fields and highlight successful initiatives that have increased representation and promoted inclusivity.
  • Engaging industry partners: Industry partners can play a crucial role in promoting equity and social justice in STEM education and careers. This session will explore partnerships between academic institutions, community organizations, and industry partners to create meaningful opportunities for underserved populations in STEM fields.
  • Data and Assessment: Using data and assessment to track progress and ensure accountability for promoting equity and social justice in STEM education is key. This session will explore best practices for collecting, analyzing, and using data to inform decision-making and measure the effectiveness of initiatives and interventions.

Overall, this minitrack will provide a unique opportunity to explore innovative approaches to promoting equity and social justice in STEM education and careers, with a focus on underserved populations. Participants will gain insights into successful initiatives, discuss best practices, and build partnerships to advance these important goals.

Minitrack Co-Chairs:

Curtis Cain (Primary Contact)
Howard University
caincc@howard.edu

Asli Yagmur Akbulut
Grand Valley State University
akbuluta@gvsu.edu

Benyawarath Yaa Nithithanatchinnapat
Penn State Behrend
benyawarath@psu.edu

This minitrack explores the transformative impact of digital technologies on crucial sectors such as healthcare and agriculture, with a strong emphasis on fostering equitable development. It also examines the driving forces behind digital transformation, its effects on marginalized communities, and the ethical considerations inherent in technology adoption. Key themes include digital health advancements, such as the influence of telemedicine, AI, and wearable technologies on healthcare accessibility, particularly for vulnerable populations. Discussions in this track will dissect issues of inclusivity, telehealth’s significance in mental healthcare, and the ethical implications of AI in healthcare. Additionally, the minitrack will analyze the implications of precision farming, climate-smart technologies, and digital decision-making tools in agriculture, evaluating their impact on smallholder farmers, food security, and the promotion of sustainable agricultural practices. Engaging discussions will also address socioeconomic perspectives, including bridging the ‘digital divide,’ the dual role of technology in exacerbating or mitigating inequality, and the influence of digital platforms on political culture.

We encourage submissions that address, but are not limited to, the following topics:

  • Digital transformation as a driver of socioeconomic development
  • Analyzing the drivers of digital transformation in organizations
  • The role of digital platforms in shaping political participation and culture
  • The dual role of IT in creating and mitigating inequality
  • The impact of digital transformation on healthcare accessibility for marginalized and vulnerable communities
  • Bridging the digital divide in healthcare
  • Ethical considerations in the implementation of AI in healthcare
  • Diversity and inclusivity in information systems design
  • Telemedicine’s role in bridging or widening health disparities
  • Telehealth services in mental health care for marginalized groups
  • Impact of wearable health technologies on elderly care
  • Role of AI in personalized medicine for underrepresented populations
  • Agricultural technology for better decision making
  • Assessing the impact of climate-smart agricultural technologies in vulnerable communities
  • Digital agricultural practices and food security in marginalized communities
  • Impact of precision farming on smallholder farmers in developing countries
  • Digital training and capacity building for sustainable agriculture
Minitrack Co-Chairs:

Arif Perdana (Primary Contact)
Monash University
Arif.Perdana@monash.edu

Juliana Sutanto
Monash University
Juliana.Sutanto@monash.edu

Misita Anwar
Swinburne University of Technology
misitaanwar@swin.edu.au

Intan Azura Mokhtar
Singapore Institute of Technology
Intan.Mokhtar@SingaporeTech.edu.sg

Information and communication technologies (ICT) empower underserved and vulnerable populations by increasing access to, for instance, health, education, work, and social and political participation. Yet, simultaneously, ‘configuring the user as everybody is an inadequate strategy to account for the diversity of users.’ That is, ICT designed for everyone risks excluding marginalized communities, failing to account for the diverse values of distinct groups in different contexts, and exacerbating persistent societal biases by design. To address these concerns, value sensitive design (VSD) offers an approach to ICT design and development that accounts for human values by design. VSD provides methods to investigate, understand, and account for direct and indirect stakeholder values and to explore value tensions and value-adds in the socio-technical context intentionally and comprehensively. It has been applied in research with various marginalized communities, including LGBTIQ+ older adults, Indigenous populations, homeless young people, and people with disabilities.

This call for papers invites submissions that address VSD and marginalized communities from scholars across all disciplines. It welcomes empirical, theoretical, or position papers. Potential topics include, but are not limited to:

  • Human values
  • Artificial intelligence values
  • Value tensions and imbalances
  • Inclusive design
  • Co-design
  • Design for value change and adaptation
  • Equity, inclusion, diversity, and empowerment
  • Inequity and exclusion
  • (In)equality in technology design
  • Privacy, autonomy, trust, and freedom from bias
  • Accessibility, justice, fairness, and responsibility
  • Environmental sustainability
  • Human welfare and human rights
  • Social inclusion, isolation, and socially disruptive technologies
Minitrack Co-Chairs:

Adam Poulsen (Primary Contact)
University of Sydney
adam.poulsen@sydney.edu.au

Oliver Burmeister
Charles Sturt University
oburmeister@csu.edu.au

Janet Davis
Whitman College
davisj@whitman.edu

David Hendry
University of Washington
dhendry@uw.edu