2023-2024 Collaboratories

Join one of our Collaboratories and become an Ahmanson Lab Innovation Scholar!

The Harman Academy is seeking creative and dedicated students to join us at the Ahmanson Lab as Innovation Scholars for 2023-2024. Ahmanson Lab Innovation Scholars will work with faculty, experts, artists, and designers in one of two team-based Collaboratories over the 2023-2024 academic year. They will also receive a $1000 stipend and have 24/7 access to technology and resources at the Lab.
 
Apply to one of the Collaboratories below to become an Innovation Scholar!

See our past Collaboratories here.
 
Applications are due by Thursday, August 31, 2023.

  • The Reclaim Project

    We are seeking motivated and creative students committed to fostering civic discourse, democratic ideals, and a diverse and equitable public sphere. Students will join a large international grant-funded project that is (1) examining the impacts of misinformation and disinformation on democracy and (2) promoting alternative media making to counter these impacts.

    Students will work with an Ahmanson Lab-based research team, the Reclaim Project, to form partnerships that foster democratic dialogue. The team will create compelling original content to counter misinformation in spaces like TikTok, Twitch, and YouTube, and will connect to and learn from the larger international team of researchers. The team will merge creative media production with scholarly research on misinformation, online hate, and social networks. We will develop strategies and templates for the production and circulation of content and connect with a variety of partners.

    We’re looking for applicants who are skilled communicators and can conduct outreach to outlets and individuals of interest to this project.

    The ideal applicant will have some of the skills noted below and the ability to work both independently and in a team setting. Most importantly, you will be guided by a strong desire to build a more just world.

    We welcome applications from both undergraduate and graduate students.

    > Social Media

    Students should have an understanding of social media platforms and how they operate, along with the ability to identify and interpret social media trends. Experience with TikTok, Twitch, and/or YouTube is a plus.

    > Media Production

    Students should have a background in content production; experience crafting content for social media is required. Prior work in videography and photography is a plus, as is experience in post-production and editing.

    > Design and Data Communication

    Students should be skilled at generating creative briefs, including pitch decks. Candidates with graphic design and relevant data skills are preferred, as are those familiar with photo and video editing software.

    > Research

    Students with experience in or an interest in researching disinformation as a phenomenon, including its parameters, spread, reach, and implications, are invited to apply. Candidates able to synthesize complex information and navigate databases are preferred.

    Apply to this Collaboratory
  • Bots vs. Ballots: Generative AI, Disinformation, and the 2024 Election

    The Ahmanson Lab seeks creative and engaged students to join an interdisciplinary research team of faculty and undergraduates to explore potential misuse and abuse cases for generative AI in the 2024 presidential election. Under the guidance of faculty and other project leaders, students will conceptualize and research specific instances of potential AI-generated disinformation campaigns in 2024, and then model, test, and evaluate their scenarios. Students do not need any prior expertise with AI; indeed, our hope is to assemble a team of students with diverse perspectives and backgrounds.

    Generative AI poses a significant threat to future U.S. elections. Political campaigns and individual bad actors can and will use generative AI to disinform, divide, and bewilder the voting public at a scale previously unseen. They will spread disinformation by, among other means, producing fake images and video, generating inordinate quantities of deceptive text, and building conversational social media bots aimed at radicalization. To further complicate matters, these technologies will be deployed at a time when the United States faces unparalleled division and a climate of mistrust and skepticism, even on the matter of election integrity itself. As the 2024 presidential election comes into focus, it is crucial for policymakers, AI developers, and others to take steps to combat these risks: this includes forecasting and testing the specific ways in which these technologies may be deployed to these ends.

    Apply to this Collaboratory