Posted by: soniahs | November 23, 2010

Exam readings: using visuals to understand science

More on the use of diagrams in understanding science. First, a paper which suggests that a key process of science education is learning how to take observations, make diagrams (or other descriptions), and then communicate about those diagrams with other people. This is a relatively simple concept, but one which is often not emphasized in science education (at least, it's not emphasized how these skills will help students learn science). The second paper is tangentially connected to this idea. It's about the challenges of incorporating data visualization tools into community science projects: tools that many scientists have no trouble interpreting, but that members of the public often struggle with.

Wolff-Michael Roth and Michelle K. McGinn. “Inscriptions: Toward a Theory of Representing as Social Practice.” Review of Educational Research. 68(1): 35-59, 1998.

Summary: The authors use the concept of inscriptions (= physical graphical displays, as distinct from mental representations) to argue for a social, rather than purely individually cognitive, view of activity. Their focus is on emphasizing the conscious consideration of inscription-creating practices during science learning; I'm skipping the discussion of pedagogy/classroom practice. Inscriptions are used in several ways in discussions: they are talked about, talked over (e.g., used as backgrounds), serve as boundary objects for discussion among different groups, have rhetorical (demonstrative) functions, and serve as pedagogical devices. Inscriptions are materially embodied signs: they are mobile (immutable while moving); can be incorporated into different contexts, rescaled, combined, and reproduced easily; can be merged with geometry (i.e., mathematicized/gridded); and can be "translated" into other inscriptions. The relationship between inscription and inscribed is traditionally thought of as correspondence or "truth"; current thought holds that inscriptions are the product of distinct social practices, and so are distinct from the thing inscribed. The practices through which inscriptions are created determine whether they will be accepted by a community; this is grounded in social practice and suggests that inscriptions can't be properly interpreted outside the context of their use. The authors also discuss their use as boundary objects with different functions in face-to-face vs. dispersed settings (though they note that networked presentation tools are allowing a fuller range of discussion using inscriptions among dispersed groups).

Comments: The focus is on formal education environments and on framing science practice as a series of acts of creating, interpreting, and sharing inscriptions. Their background discussion helps tie together some of my other readings on communities of participation, distributed cognition, and visualizations.
Links to: various things…

Stephanie Thompson and Rick Bonney. "Evaluating the Impact of Participation in an On-line Citizen Science Project: A Mixed-Methods Approach." In J. Trant and D. Bearman (eds.), Museums and the Web 2007: Proceedings. Toronto: Archives & Museum Informatics. Published March 1, 2007.

Summary: Report on an assessment of participant use of eBird, the Cornell Lab of Ornithology's online bird-sighting tracking software. In eBird, participants enter information about their bird sightings either from a list or on a map; these data are then pooled with other observations. Users can choose among several tools for visualizing all bird observations, either selecting one species to focus on or selecting all observations from a particular area. Tools include maps and various types of charts. The project has educational goals, but it is entirely self-instructed and self-directed (instructions and a FAQ are available). In 2005, CLO conducted a new-user survey, administered at registration and again eight weeks later; it included a standard demographic questionnaire, an assessment of users' understanding of the "View and Explore Data" tools, and a "Personal Meaning Mapping" exercise about conservation (a short-answer assessment approach). For the data analysis tools, they found that most users who responded didn't select the correct tools to answer the question asked. In addition, many people didn't answer this question at all, probably because they hadn't used or weren't comfortable with these tools. The authors suggest that more active instruction in how to use the tools is probably needed.

Comments: This paper mainly presents an example of the challenge of incorporating data visualization tools into an informal learning setting.

