Final Projects/Audience Impact

Dr. Nathaniel Cline

Agenda

  1. Project Schedules
  2. Audience Impact
  3. Studio Time

Class Schedule

11/13: Audience impact

11/15: Enhancing graphics

11/20: Class wrap up

11/27: Presentations

11/30: Presentations

Project presentations

  • Your final project needs to be live by the presentations (11/27). You can negotiate with me what this means.

  • The presentations should ideally include visual representation in slides.

  • You have 5 minutes to present, and then time for questions.

Project write-up

Along with your slides for your presentation, you will submit a final write-up consisting of:

  • A 300-500(ish) word write-up on how you found your data, cleaned it, analyzed it, and synthesized it into a story. You need to be specific in your description of the data.

  • A 300-500(ish) word “impact” write-up that includes:

    • A set of goals that explain why you want to tell your audience your data story
    • A stated and well-defined audience for your data story
    • A write-up of your audience assessment strategy

Grading

For both Johnston and non-Johnston students there are two pieces that matter for my assessment of your performance:

  1. Your completion of in-class (“data sketches”) and out-of-class (“homeworks”) work. These are P/F.

  2. Your group project:

    1. Did you complete all steps of the project?
    2. Did you complete the write-up and presentation?
    3. Assessment of the project itself.
    4. Assessment of your contribution to the group.

Non-Johnston Grading

For non-Johnston students, our syllabus states that your grade has a roughly 50/50 weighting:

  • 50% for in-class and out-of-class work
  • 50% for projects

Non-Johnston Grading Breakdown

  1. Did you complete all steps of the project (did you submit what I asked on Forms)? P/F

  2. Did you complete the write-up and presentation? A/B/F

  3. Assessment of the project itself (see next slide)

  4. Assessment of your contribution to the group. A/B/F

Project Assessment

Each criterion is scored Excellent (4), Good (3), Satisfactory (2), or Needs Improvement (1):

  1. Appropriate Data

    • Excellent (4): The chosen data is excellent, demonstrating a clear understanding of and relevance to the project goals; thoroughly cleaned and prepared for analysis.
    • Good (3): The chosen data is good, with a clear understanding of and relevance to the project goals; adequately cleaned and prepared for analysis.
    • Satisfactory (2): The chosen data is satisfactory, but there may be some gaps in understanding or relevance; cleaning is basic and may have minor issues.
    • Needs Improvement (1): The chosen data is weak, with limited understanding of or relevance to the project goals; cleaning is inadequate or incomplete.

  2. Clear Data Story

    • Excellent (4): The data story is exceptionally clear, insightful, and effectively communicates the chosen narrative.
    • Good (3): The data story is clear, insightful, and effectively communicates the chosen narrative.
    • Satisfactory (2): The data story is clear but may lack depth or originality in its insights.
    • Needs Improvement (1): The data story is somewhat unclear or lacks substantial insights.

  3. Connection between Story and Method

    • Excellent (4): The method of communication demonstrates a seamless and logical connection to the data story.
    • Good (3): The method of communication has a clear and logical connection to the data story.
    • Satisfactory (2): The method of communication has some connection to the data story but may lack clarity.
    • Needs Improvement (1): The method of communication has a weak or unclear connection to the data story.

  4. Execution of Communication Method

    • Excellent (4): The chosen method is executed exceptionally well, adhering to the principles of aesthetics and perception discussed in class.
    • Good (3): The chosen method is well executed, following principles of aesthetics and perception.
    • Satisfactory (2): The execution of the chosen method is satisfactory but may have minor flaws.
    • Needs Improvement (1): The execution of the chosen method is weak, with noticeable flaws in aesthetics or perception principles.

  5. Clearly Defined Audience

    • Excellent (4): The audience is clearly defined, and the project effectively targets them with precision.
    • Good (3): The audience is well defined, and the project appropriately targets them.
    • Satisfactory (2): The audience is mentioned but may not be fully developed, and targeting is somewhat vague.
    • Needs Improvement (1): The audience is unclear, or targeting is not well aligned with the project goals.

  6. Assessment Design

    • Excellent (4): The assessment design is comprehensive, effectively gauging audience engagement and understanding.
    • Good (3): The assessment design is good, providing valuable insights into audience engagement.
    • Satisfactory (2): The assessment design is basic but provides some insights into audience engagement.
    • Needs Improvement (1): The assessment design is ineffective or lacks relevance to the project goals.

Edits?

How would you edit the project assessment rubric? What would you add/remove?

Physical ideas

  • Feedback wall

  • Pledge or feedback tree

  • Poll

  • Comment cards

  • Vox pop – a quick question or two asked in person (or online) to gather data.

Online

  • Pre/post quiz or feedback

  • Click analysis

  • Action/donation numbers

Both

  • Focus group

  • Pre and post interviews

Examples

BeeSmart video focus group:

Before questions:

  1. Which of these factors do you think affects food production the most?

    1. Drought
    2. Bee colony collapse
    3. Global population increase
    4. Crop diseases
    5. GMO

  2. Where do you think most honeybees in the US live? (Select state)

After questions:

  1. After watching this video, does your answer to question 1 change?

    1. Drought
    2. Bee colony collapse
    3. Global population increase
    4. Crop diseases
    5. GMO

  2. Did you learn something new about honeybees in America?

  3. After watching this video, if someone asked you to sign the petition, would you be more willing to do so?
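Once you have pre/post responses like these, the analysis can be as simple as counting who changed their answer and what they changed it to. A minimal sketch in Python (the response data below is invented for illustration, not from the BeeSmart study):

```python
from collections import Counter

def tally_shifts(responses):
    """Count how many respondents changed their answer after the video.
    `responses` is a list of (pre_answer, post_answer) pairs.
    Returns the number who changed and a Counter of their new answers."""
    changed = [(pre, post) for pre, post in responses if pre != post]
    return len(changed), Counter(post for _, post in changed)

# Hypothetical focus-group responses: (answer before video, answer after)
responses = [
    ("Drought", "Bee colony collapse"),
    ("GMO", "Bee colony collapse"),
    ("Crop diseases", "Crop diseases"),
    ("Global population increase", "Bee colony collapse"),
]

n_changed, new_answers = tally_shifts(responses)
print(f"{n_changed} of {len(responses)} respondents changed their answer")
# prints: 3 of 4 respondents changed their answer
print(new_answers.most_common(1))
# prints: [('Bee colony collapse', 3)]
```

The same pattern works for the petition question: pair each respondent's pre and post willingness and count the shifts toward “more willing.”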

Examples

Affluent white Bostonians article focus group:

Pre-question:

  • “On a scale of 1-5, with 1 being the least, and 5 being the most, how much do you know about stop and frisk practices in Boston?”

  • If the respondent said three or higher, we followed up with the question:

  • “On the same scale of 1-5, how problematic are current stop and frisk practices in Boston?”

Examples

Affluent white Bostonians article focus group:

  • Then the interviewee would go through the website.

  • We used a screen recording to see how readers engaged with the website.

Examples

Afterwards we asked them a series of informational questions:

  • How did this piece make you feel? (prompting questions if the respondent asked for clarification: Did you think it was entertaining? Did you think it was surprising?)

  • What was your overall reaction to this piece?

  • How did the balance of humor and real facts work for you?

  • Did you notice the annotations and click on them while you were reading?

  • Could you tell the difference between the joke data and the real data?

  • Did you trust the real data?

  • Do you feel motivated to do something about this problem?

  • Which action at the bottom would you be most likely to take?

  • Did this article change your opinion of stop and frisk practices in Boston?

Examples

As a final information gathering mechanism, we installed Google Analytics on the site to track visitor behavior.