1. Project Schedules
2. Audience Impact
3. Studio Time
11/13: Audience impact
11/15: Enhancing graphics
11/20: Class wrap up
11/27: Presentations
11/30: Presentations
Your final project needs to be live by the presentations (11/27). You can negotiate with me what this means.
The presentations should ideally include visual representations in slides.
You have 5 minutes to present, and then time for questions.
Along with your slides for your presentation, you will submit a final write up of:
A 300-500(ish) word write up on how you found your data, cleaned it, analyzed it, and synthesized it into a story. You need to be specific in your description of the data.
A 300-500(ish) word “impact” write up that includes:
For both Johnston and non-Johnston students there are two pieces that matter for my assessment of your performance:
Your completion of in-class (“data sketches”) and out-of-class (“homeworks”) work. These are P/F.
Your group project:
For non-Johnston students, our syllabus states that your grade has a roughly 50/50 weighting:
Did you complete all steps of the project (did you submit what I asked on Forms)? P/F
Did you complete the write up and presentation? A/B/F
Assessment of the project itself (see next slide)
Assessment of your contribution to the group. A/B/F
| Criteria | Excellent (4) | Good (3) | Satisfactory (2) | Needs Improvement (1) |
|---|---|---|---|---|
| 1. Appropriate Data | The chosen data is excellent, demonstrating a clear understanding and relevance to the project goals. Thoroughly cleaned and prepared for analysis. | The chosen data is good, with a clear understanding and relevance to the project goals. Adequately cleaned and prepared for analysis. | The chosen data is satisfactory, but there may be some gaps in understanding or relevance. Cleaning is basic and may have minor issues. | The chosen data is weak, with limited understanding or relevance to the project goals. Cleaning is inadequate or incomplete. |
| 2. Clear Data Story | The data story is exceptionally clear, insightful, and effectively communicates the chosen narrative. | The data story is clear, insightful, and effectively communicates the chosen narrative. | The data story is clear but may lack depth or originality in insights. | The data story is somewhat unclear or lacks substantial insights. |
| 3. Connection between Story and Method | The method of communication demonstrates a seamless and logical connection to the data story. | The method of communication has a clear and logical connection to the data story. | The method of communication has some connection to the data story but may lack clarity. | The method of communication has a weak or unclear connection to the data story. |
| 4. Execution of Communication Method | The chosen method is executed exceptionally well, adhering to principles of aesthetics and perception discussed. | The chosen method is well-executed, following principles of aesthetics and perception. | The execution of the chosen method is satisfactory but may have minor flaws. | The execution of the chosen method is weak, with noticeable flaws in aesthetics or perception principles. |
| 5. Clearly Defined Audience | The audience is clearly defined, and the project effectively targets them with precision. | The audience is well-defined, and the project appropriately targets them. | The audience is mentioned but may not be fully developed, and targeting is somewhat vague. | The audience is unclear, or targeting is not well-aligned with the project goals. |
| 6. Assessment Design | The assessment design is comprehensive, effectively gauging audience engagement and understanding. | The assessment design is good, providing valuable insights into audience engagement. | The assessment design is basic but provides some insights into audience engagement. | The assessment design is ineffective or lacks relevance to the project goals. |
How would you edit the project assessment rubric? What would you add/remove?
Feedback wall
Pledge or feedback tree
Poll
Comment cards
Vox pop – a quick question or two asked by a person (or online) to gather data.
Pre/post quiz or feedback
Click analysis (see the sketch after this list)
Action/donation numbers
Focus group
Pre and post interviews
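As a rough illustration of the “click analysis” item, here is a minimal Python sketch. It assumes clicks are logged to a CSV with `visitor_id` and `element_id` columns (both names and the file name are hypothetical) and simply tallies clicks and distinct visitors per element.

```python
import pandas as pd

# Hypothetical click log: one row per click, with "visitor_id" and
# "element_id" columns (column names are assumptions about your logging).
clicks = pd.read_csv("click_events.csv")

# Total clicks and distinct visitors for each clickable element,
# sorted so the most-used elements come first.
summary = (
    clicks.groupby("element_id")
    .agg(total_clicks=("element_id", "size"),
         unique_visitors=("visitor_id", "nunique"))
    .sort_values("total_clicks", ascending=False)
)
print(summary)
```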
BeeSmart video focus group:
Before questions:
Which of these factors do you think affects food production the most?
- Drought
- Bee colony collapse
- Global population increase
- Crop diseases
- GMO
Where do you think most honeybees in the US live? (Select state)
After questions:
After watching this video, does your answer to question 1 change?
Did you learn something new about honeybees in America?
After watching this video, if someone asked you to sign the petition, would you be more willing to do so?
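A minimal sketch of how these before/after responses could be tallied, assuming one CSV row per participant with hypothetical columns `q1_before`, `q1_after`, and `sign_petition_after`:

```python
import csv
from collections import Counter

# Hypothetical responses file: one row per participant, with columns
# "q1_before", "q1_after", and "sign_petition_after" (names are assumptions).
with open("beesmart_responses.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# How many participants revised their answer to question 1 after the video,
# and how many said they would be more willing to sign the petition.
changed = sum(1 for r in rows if r["q1_after"] != r["q1_before"])
more_willing = sum(1 for r in rows if r["sign_petition_after"].strip().lower() == "yes")

print(f"{changed}/{len(rows)} participants changed their answer to question 1")
print(f"{more_willing}/{len(rows)} would be more willing to sign the petition")
print("Post-video answer distribution:", Counter(r["q1_after"] for r in rows))
```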
Affluent white Bostonians article focus group:
Pre-question:
“On a scale of 1-5, with 1 being the least, and 5 being the most, how much do you know about stop and frisk practices in Boston?”
If the respondent said three or higher, we followed up with the question:
“On the same scale of 1-5, how problematic are current stop and frisk practices in Boston?”
Then the interviewee would go through the website.
We used a screen recording of the website to see how readers engaged with it.
Afterwards we asked them a series of informational questions:
How did this piece make you feel? (prompting questions if respondent asked for clarification: Did you think it was entertaining? Did you think it was surprising?)
What was your overall reaction to this piece?
How did the balance of humor and real facts work for you?
Did you notice the annotations and click on them while you were reading?
Could you tell the difference between the joke data and the real data?
Did you trust the real data?
Do you feel motivated to do something about this problem?
Which action at the bottom would you be most likely to take?
Did this article change your opinion of stop and frisk practices in Boston?
As a final information gathering mechanism, we installed Google Analytics on the site to track visitor behavior.
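If the Google Analytics data needs to be pulled programmatically rather than read off the dashboard, a minimal sketch using the official `google-analytics-data` Python client might look like the following. The GA4 property ID and date range are placeholders, and credentials are assumed to be set up via GOOGLE_APPLICATION_CREDENTIALS.

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

# Placeholder GA4 property ID; replace with your own.
PROPERTY_ID = "123456789"

client = BetaAnalyticsDataClient()  # reads GOOGLE_APPLICATION_CREDENTIALS

# Page views and active users per page for the project window (dates are examples).
request = RunReportRequest(
    property=f"properties/{PROPERTY_ID}",
    dimensions=[Dimension(name="pagePath")],
    metrics=[Metric(name="screenPageViews"), Metric(name="activeUsers")],
    date_ranges=[DateRange(start_date="2023-11-01", end_date="2023-11-30")],
)
response = client.run_report(request)

for row in response.rows:
    print(row.dimension_values[0].value,
          row.metric_values[0].value,
          row.metric_values[1].value)
```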
Econ 255 - Data Storytelling