Spring 2024 AIIDE TLC Deliverables


Our Spring 2024 AIIDE TLC cohort was tasked with creating deliverables for the topics covered over the 14-week class period. The topics were Inclusive Assessment, Unconscious Bias in the Classroom, Mitigating Financial Barriers with Open-Source AI in the College Classroom, and Mitigating Racial and Cultural Biases in GenAI Tools. Click on each topic to learn more.

Goals
  • To become informed about alternative ways to assess the student(s) in your classroom (studio, lab)
  • To gain one or two practical assessments (assessment skills) you can apply this semester
  • To consider if our assessments are actually measuring the skills/learning/competency we want the student(s) to demonstrate
  • To consider if we have given the student sufficient opportunities to demonstrate their learning/skills/competency

 


Questions for the Instructor
  • Who is in my class?
  • What are the skills/competencies I want my students to demonstrate by the end of the semester?
  • Am I grading behavior (for example, lateness), or am I grading competency and demonstration of the skills to be learned?
  • How much do I value growth and learning and/or are there certain benchmarks that need to be met?

 

Facts/Things to Consider
  • Students learn and demonstrate knowledge in different ways
  • Students bring different cultural norms and practices to the classroom
  • Some students are more versed in traditional assessment methods, academic norms, and behaviors than others
  • A 2016 initiative to implement more inclusive grading at a high school reduced the grade gap for low-income students and improved the correlation between student course grades and standardized test scores. (Source: Feldman)

 

Strategies

In order to assess inclusively, we might:

  • Take time to articulate what it is you really want to assess
    • Write down the five most important skills/learning objectives in your class
    • Consider whether your assessment practices are actually measuring these learning objectives
  • Include diverse modes in which work, skills, and learning can be demonstrated
  • Offer students a choice of the method or form used to demonstrate learning
  • Offer multiple opportunities for students to succeed at demonstrating skills and learning

 

Examples

  • Students can choose to take one test at the end of the course, or multiple tests along the way
  • Students can submit work in written form, as a video or drawing, as a podcast, or as live sharing in class
  • Offer numerous weekly quizzes on small amounts of material to provide opportunities for success
  • Offer opportunities to revise or resubmit, so that what becomes important is that the student has gained the skill or demonstrated the learning
  • Invite student research as a part of the development of diverse resources included on a particular subject area
  • Ask for a range of different kinds of demonstration of learning, e.g., individual, partner, small group
  • Include material and examples in class and on tests that represent diversity and the cultures and identities that exist in your student body
  • Include peer to peer reflection and feedback as part of overall assessment
  • Define clearly what a student needs to be able to demonstrate in order to ‘get an A’, and if they can demonstrate that, give them an A

 

Strategies for Inclusive Grading from Grading for Equity by Joe Feldman

Strategy: Avoid giving 0s that overweight missed assignments, and avoid 0s for completely wrong answers.

  • Problem: On a 90/80/70/60 scale each letter grade spans ten points, but an F spans sixty (0–59), so a single 0 overweights the failure and can dig an early hole that a student can't climb out of.
  • Suggestion 1: Require all work to be completed. Student receives an incomplete until everything is in. (No late penalties.)
  • Suggestion 2: Make 50% the lowest recorded grade (if using a 90/80/70/60 scale)
  • Suggestion 3: Grade on a 4=A, 3=B, 2=C, 1=D, 0=F scale
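The arithmetic behind this problem and the suggestions above can be sketched in a few lines of Python; the scores, cutoffs, and conversion below are hypothetical illustrations, not a prescribed scheme.

```python
# A student with four solid B's (85) and one missed assignment (0).
# All scores are hypothetical.

def average(scores):
    return sum(scores) / len(scores)

raw = [85, 85, 85, 85, 0]
print(average(raw))      # 68.0 -- a D+, despite consistent B-level work

# Suggestion 2: floor every recorded score at 50 before averaging.
floored = [max(s, 50) for s in raw]
print(average(floored))  # 78.0 -- much closer to the demonstrated work

# Suggestion 3: convert to a 4=A ... 0=F scale, so an F costs one step,
# not the sixty points it spans on a 100-point scale.
def to_four_point(score):
    if score >= 90: return 4
    if score >= 80: return 3
    if score >= 70: return 2
    if score >= 60: return 1
    return 0

points = [to_four_point(s) for s in raw]
print(average(points))   # 2.4 -- between a C and a B
```

The sketch makes the imbalance concrete: on the 100-point scale the single 0 moves the average down by 17 points, while on the 4-point scale the same failure moves it by only 0.6.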

Strategy: Use something other than a straight average of assignment grades to create the overall grade.

  • Problem 1: A student performs consistently well but has one really bad assignment that pulls their grade down.
  • Problem 2: Consider a student who struggles early but then “gets it” versus a student who came into the course already knowing the material. If we give assignments equal weight then the student who struggles with the material early in the course will end up with a much lower grade than the student who started already knowing the material.
  • Suggestion 1: Use the median score instead of the mean score to prevent a couple of poor assignments from pulling down otherwise good work.
  • Suggestion 2: Weight later work higher than earlier work. Let students know they will be evaluated on their performance at the end of the course.
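Both suggestions are easy to state in code. The sketch below uses Python's statistics module and hypothetical scores; the linear weights are just one possible way to emphasize later work.

```python
from statistics import mean, median

# Problem 1: a consistent student with one bad assignment.
scores = [92, 95, 40, 91, 94]
print(mean(scores))    # 82.4 -- the one bad score drags the grade down
print(median(scores))  # 92   -- reflects the student's typical work

# Suggestion 2: weight later assignments more heavily.
def weighted_grade(scores, weights):
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# A student who struggled early but mastered the material by the end.
late_bloomer = [55, 65, 80, 90, 95]
weights = [1, 2, 3, 4, 5]  # later work counts for more
print(mean(late_bloomer))                     # 77.0 -- flat average
print(weighted_grade(late_bloomer, weights))  # 84.0 -- rewards final mastery
```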

Strategy: Grade individual achievement within group projects rather than giving a single grade to the group.

Strategy: Don’t give extra credit.

  • Problem: Extra credit gives the impression that the course is about acquiring points rather than mastering material; it suggests that some material is less important because it can be replaced by extra credit; and it is not equitable for students without the resources (e.g., time) to complete it.
  • Suggestion: “If the work is important, require it; if it is not, don’t include it in the grade.”

Strategy: Allow late work at no penalty (up to a limit).

  • Problem: Late penalties mean that grades no longer accurately reflect student learning, and they disproportionately harm students with fewer resources or other external hardships.
  • Suggestion: Provide highly recommended dates when work should be completed, but allow late work up to the time of the summative assessment on that work or some other time when completing the work no longer has educational value.

Strategy: Don’t grade homework (formative assessments); use only summative assessments for the course grade.

  • Problem: Homework is typically a time when students struggle with applying the course concepts and make mistakes. Students should not be penalized for working to learn the material.
  • Suggestion: Homework (and other formative assessments) should be used to provide feedback to students and assess how the class overall is managing the material. It should be recorded but with a weight of 0% in the course grade. Give 100% weight to summative assessments. 
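In gradebook terms, this amounts to recording every category but giving formative work a weight of zero. A minimal sketch, with made-up category names and scores:

```python
# Formative work (homework) is recorded for feedback but weighted 0%;
# summative assessments carry 100% of the course grade.
weights = {"homework": 0.0, "exams": 1.0}

gradebook = {
    "homework": [60, 70, 85, 90],  # visible to the student, zero weight
    "exams": [88, 92],             # these alone determine the grade
}

def course_grade(gradebook, weights):
    total = 0.0
    for category, scores in gradebook.items():
        category_average = sum(scores) / len(scores)
        total += weights[category] * category_average
    return total

print(course_grade(gradebook, weights))  # 90.0 -- the exam average only
```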

Strategy: Require (rather than merely offer) a retake or redo for students who perform poorly.

Strategy: Use rubrics.

Strategy: Emphasize soft skills that lead to improved performance, without assigning grades to those skills.

Learning Objectives
  1. Identify motivations for using open-source AI in higher education to mitigate financial barriers.
  2. Identify associated ethical considerations.
  3. Identify the capabilities and limitations of open-source AI tools.

 

Audience

This guide is intended for those with little to no experience using AI tools.

 

What is Generative AI?

Generative AI has gained a lot of attention in recent years due to its ability to create content that can be remarkably similar to what humans produce. It is particularly impressive because it can generate new data that wasn't explicitly in its training dataset.

There are various applications of generative AI across different fields, including:

  • Natural Language Generation: Can create human-like text based on the input it receives. It can be used to generate news articles, stories, essays, or even whole novels.
  • Image Generation: Can create realistic images imitating different artistic styles, even from textual descriptions (prompts), or generate entirely new images based on learned patterns.
  • Music Generation: Can compose music, imitating different styles and genres, and even create new compositions.
  • Video Generation: Can generate video content, including deepfake videos, where it can manipulate and alter existing videos or create new ones.
  • Design and Art: Can assist designers by generating new designs, logos, or art pieces based on specific criteria or styles.

While generative AI has shown remarkable capabilities, it also raises ethical concerns, especially around issues like deepfakes, copyright infringement, and data privacy. As this technology continues to evolve, it is important to establish guidelines that ensure responsible and ethical use.

It's also worth noting that while generative AI can produce impressive results, it's not always perfect and can sometimes generate misleading or incorrect information. Therefore, it's crucial to approach the content generated by AI with a critical mind and verify the information when needed. 

Perhaps the most popular natural language generator today is ChatGPT; however, access to its advanced AI tools requires a subscription fee. This subscription-based model can create an inequality gap, especially in educational settings.

When essential educational resources are gated behind paywalls, it can hinder the learning experience for students who cannot afford these subscriptions. This can lead to disparities in educational outcomes and opportunities, further widening the socioeconomic divide.

To address this issue, several potential solutions can be considered, but in this piece we argue in favor of the use and support of open-source and free alternatives. It is worth clarifying that open source refers to code that is designed to be publicly accessible: anyone can see, modify, and distribute the code as they see fit. (Contrary to what the name suggests, OpenAI, the company behind ChatGPT, does not release open-source code.) Encouraging and supporting the use of open-source AI tools can provide students with access to essential resources without the need for expensive subscriptions. Many open-source AI projects offer robust capabilities and can be a viable alternative to premium tools.

 

What Competent Free Alternatives Are There?

By implementing, encouraging, and supporting free AI tools, we can work towards creating a more inclusive and equitable educational environment where all students have the opportunity to explore and learn from the transformative power of AI, regardless of their financial circumstances.

We have identified some great free alternatives to ChatGPT to be used in and out of the classroom. Our favorite ones are:

  • https://gpt4all.io/index.html 
    This software can be installed locally, eliminating concerns about private data being captured and shared by companies with a vested interest in it. Additionally, it doesn't require a stable internet connection; users can even run it with no internet connection at all. It offers a chat experience similar to ChatGPT without a subscription payment, though its locally run models are generally less capable than GPT-4.
  • https://huggingface.co/chat/
    This is a free alternative from one of the most reputable companies in the AI universe. However, it requires an internet connection and data is collected.
  • https://github.com/ggerganov/llama.cpp?tab=readme-ov-file
    This one requires some familiarity with computer programming, but the instructions for installing it locally are relatively straightforward. We consider it a better option than the first one; however, due to the higher technical skill required to work with it, it is not suitable for everyone.

 

Financial and Ethical Considerations

Many educators considering bringing AI tools into their classrooms are concerned about their ethical implications.  Here we focus specifically on comparing the challenges and opportunities associated with open-source AI tools.  

One of the biggest advantages of open-source AI tools is that they are free to all users; there are no premium subscription packages for those willing to pay, so no additional financial investment is needed beyond an electronic device with internet access. A recent study by JISC found that access to the most common commercially available AI tools would cost a student more than $100 per month, so using open-source alternatives could remove a significant financial barrier for students. Some open-source tools, such as GPT4All or Llama, can be downloaded and run on a local machine, so a student with internet access at school or a public library could download one there and use it at home even without internet access.

Another advantage of open-source tools is that the code is fully accessible to the users.  This makes the inner workings and associated biases potentially more transparent, though detailed investigation is needed to understand the particular biases associated with any given tool.  Similarly, ethical considerations about privacy and data security could vary between tools but are generally similar between commercial and open-source tools.  However, open-source tools designed to be run offline on a local machine also provide additional privacy and data security as no user data is reported to the code developers.

Those using open-source AI tools for the first time may wonder about the licensing of the tools, and whether classroom use cases would violate any copyright policies.  While the licensing can vary between different tools, many of the most popular ones and all three discussed above can be freely used and modified for educational or research purposes. 
 

Resources

Artificial Intelligence (AI) and, in particular, Generative AI are increasingly used to generate diverse types of content such as text, images, and videos. This content is being made public on social networks, in literature, in the media, and even in scientific papers. To mitigate and avoid perpetuating bias, it is essential to recognize that generative algorithms are biased in diverse ways. Current algorithms have been trained on consciously or unconsciously biased human inputs. As time passes and more AI-generated content becomes available, algorithms will be trained on this AI-generated content as input, which can compound these biases. Breaking this loop as soon as possible will be crucial to avoiding bias perpetuation. Below, different types of bias and some strategies to combat them are described:
 

Bias in Data:

AI systems learn from historical data and reflect societal biases. If training data is biased, the AI model will learn and replicate these biases.

Combat Strategies: 

  • Collect diverse and representative datasets to mitigate biases. Include inclusive and diverse perspectives.
  • Use techniques like data preprocessing, bias detection, and algorithmic auditing to identify and mitigate biases in datasets and models.
  • Publish large-scale open-source diverse datasets.
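As a concrete, if toy, illustration of bias detection in a dataset, the sketch below computes the rate of positive outcomes per demographic group and their ratio. The records and the 0.8 threshold (the "four-fifths" rule of thumb) are illustrative assumptions; a real audit requires far more careful analysis.

```python
# Toy dataset: each record has a demographic group and a binary outcome.
records = [
    {"group": "A", "positive": True},
    {"group": "A", "positive": True},
    {"group": "A", "positive": True},
    {"group": "A", "positive": False},
    {"group": "B", "positive": True},
    {"group": "B", "positive": False},
    {"group": "B", "positive": False},
    {"group": "B", "positive": False},
]

def positive_rate(records, group):
    members = [r for r in records if r["group"] == group]
    return sum(r["positive"] for r in members) / len(members)

rate_a = positive_rate(records, "A")  # 0.75
rate_b = positive_rate(records, "B")  # 0.25

# Disparate-impact ratio: values well below 1.0 flag a skew worth auditing.
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(ratio < 0.8)  # True -- this dataset fails the four-fifths rule of thumb
```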

Resource: "ViSAGe: A Global-Scale Analysis of Visual Stereotypes in Text-to-Image Generation," Jha, A. et al.

  • Text-to-Image model generations can reflect social stereotypes present in the real world.
  • The authors provide a dataset full of visual stereotypes and present a method for evaluating Text-to-Image models with a focus on "regional stereotypes."
  • Stereotypical attributes are thrice as likely to be present in generated images of corresponding identities as compared to other attributes, and the offensiveness of these depictions is especially higher for identities from Africa, South Africa, and Southeast Asia.

 

Algorithmic Bias:

AI algorithms perpetuate biases present in data; e.g., facial recognition systems are less accurate for certain racial groups.

Resource: "AI and Algorithmic Bias: the issues with technology reflecting history and humans," Jackson, M.

  • Two examples from this resource: insufficient training images of people with darker skin led an image-recognition system to label African American people as gorillas; touch-free soap dispensers trained poorly on darker skin tones worked only with lighter-skinned hands.
  • An algorithm must possess:
    1. Finiteness (it terminates after a finite number of steps)
    2. Definiteness (each step is precisely and unambiguously specified)
    3. Input (zero or more values supplied before or during execution)
    4. Output (one or more results produced)
    5. Effectiveness (each step is basic enough to be carried out exactly and in a finite amount of time)
  • Artificial Intelligence (AI) means understanding humans' thought processes and replicating their behavior in machines. This requires large amounts of data and fast, iterative algorithm processing.
  • Algorithmic bias: Because humans have racial and gender biases, and humans create the algorithmic foundation, consciously or unconsciously, these biases seep into algorithms. Then, AI systems replicate these biases in their decision-making processes.

Combat Strategies:

  • Regularly evaluate and audit AI algorithms to identify and mitigate biases. This could involve testing algorithms across different demographic groups to ensure fairness.
  • Employ techniques like adversarial training and fairness-aware learning to train models that are more robust to biases.
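A minimal sketch of the first strategy, an audit that compares a model's accuracy across demographic groups, is shown below; the labels, predictions, and group names are made up, and a real audit would use a held-out evaluation set and multiple fairness metrics.

```python
# Each tuple is (true_label, model_prediction, demographic_group).
results = [
    (1, 1, "X"), (0, 0, "X"), (1, 1, "X"), (0, 0, "X"),
    (1, 0, "Y"), (0, 0, "Y"), (1, 1, "Y"), (0, 1, "Y"),
]

def accuracy_by_group(results):
    hits_by_group = {}
    for label, prediction, group in results:
        hits_by_group.setdefault(group, []).append(label == prediction)
    return {g: sum(hits) / len(hits) for g, hits in hits_by_group.items()}

audit = accuracy_by_group(results)
print(audit)  # {'X': 1.0, 'Y': 0.5} -- a gap this large warrants investigation
```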

 

Lack of Diversity in AI Development:

A lack of diversity among the people working in AI can result in blind spots in understanding and addressing issues related to race and bias.

Combat Strategies:

  • Promote diversity and inclusion in AI development teams. Encouraging participation from individuals from diverse racial and ethnic backgrounds can bring different perspectives and insights.
  • Support initiatives and organizations that aim to increase diversity in AI, like scholarships, mentorship programs, and inclusive hiring practices.

 

Transparency and Accountability:

Many AI systems operate as "black boxes," making it difficult to understand how they make decisions and whether they exhibit biases.

Combat Strategies:

  • Prioritize ethical considerations in AI development and deployment. Consider the potential impacts on different racial and ethnic groups and design systems that promote fairness, justice, and equity.
  • Engage with communities that will be directly affected by AI technology to ensure their perspectives and concerns are taken into account.

 

Interacting with AI Systems:

AI systems are input-output systems; you will get out what you put in.

Combat Strategies:

  • Educators should discuss with students how to interact responsibly and ethically with AI systems.
  • Do not prompt ChatGPT to provide inappropriate responses. Each new version adds safeguards, but they are not comprehensive.
  • Universities that use systems like Copilot should not share information back with GenAI providers, so as not to reinforce Western European norms; doing so creates unfair bias because other universities do not have access to these resources.
  • Educators should teach students to critically examine the results they receive and to consider how the historical and societal biases these systems were trained on influence those results.

 

Objectives:
  • To define unconscious bias and provide guidance to course directors and faculty (the target populations) on how to recognize their own unconscious bias in a classroom setting, along with approaches to address it
  • To provide multimedia resources (accessible articles, videos, and trainings) that can be used to gain more information, examples, and strategies

 

Unveiling Unconscious Bias in the Classroom: Understand, Recognize, Address

 

Understand Unconscious Bias

Unconscious bias, also called implicit bias, is when we make decisions or assumptions about, or show preferences for, others without realizing it. Broadly, unconscious bias can include stereotyping (perception bias), seeking or accepting only information that supports what you already think (confirmation bias), preferring those who share your characteristics (affinity bias), or attributing good or bad qualities to someone without getting to know them (halo effect). Everyone holds some form of unconscious bias, and even with the best of intentions, it can significantly affect others. In the classroom, these biases shape how instructors perceive and treat their students; when unaddressed, they can harm students' mental health, academic performance, and overall success.

[Graphics and short descriptions of the biases discussed in the paragraph above]

Recognize Your Own Unconscious Bias
  1. Become aware of and reflect on your unconscious biases. A variety of tools have been developed to help illuminate where you might have unconscious bias, including Project Implicit, which has a variety of domains in which to test yourself.
  2. Intentionally engage with people of different backgrounds. This can be uncomfortable and requires you to embrace feelings of discomfort, which can be a good source of learning.
  3. Recognize and accept that everyone, including you, your colleagues, and your students, has unconscious biases and that these biases do not reflect a person's quality as an individual.
  4. Foster a growth mindset. It is critical to understand that becoming aware of and addressing your unconscious bias will require continuous reflection and intentionality.

 

Address Unconscious Bias in the Classroom
  1. Set the tone for a more inclusive classroom and allow for time to get to know your students and for your students to get to know each other. Doing this early on can help increase the sense of belonging for students.
  2. Practice equity in the classroom. Examples of this would be to hold every student to high expectations while ensuring each student's individual needs are being met and learning styles accommodated; use transparent grading guidelines; avoid making assumptions about your students; do not ask students of color to be an expert on their race or ethnicity; be aware of religious holidays; and be mindful of your use of technology and students with physical disabilities.
  3. Diversify your curriculum to include a broader range of exposures to individuals contributing to or working in your field.

 

Resources:
  1. Informational videos about unconscious bias: Center for Urban Education
  2. University of California, Los Angeles' Equity, Diversity & Inclusion's Implicit Bias page