Course reflection

May 12th, 2010

As I read my classmates’ reflections, I saw that several went back to their first post to see whether their course expectations had been met.  I think I’ll do the same, although my expectations weren’t quite as well-defined as some of theirs.

I expected to encounter new technology and ask three questions in evaluating it:

  • Is this tool one that the average teacher can master easily?
  • In what specific, practical ways will this tool enhance learning?
  • What is the cost of investing in this tool?  What is the cost of not investing in it?

I think we, as a class, did answer these questions each time we worked with a new digital tool.  And, amazingly enough, almost all of the tools received a “Yes” to the first question!  I know Second Life was challenging for some, but I think most of the issues came up when something “broke.”  I know that was the case for me.

We were able to enumerate several classroom affordances for every technology.

Finally, the big question:  What about cost?  Of course, so much comes down to money.  That leads to discussions about equity and accessibility, problems I doubt will disappear in the near future.

But one thing we did find, which was interesting to me, was that every time we discussed how to integrate technology into the classroom, the issue of time and standards testing came up.  Learning and teaching new methods takes time, which is very precious now in the world of high-stakes testing.  So, educational reform seems to be inextricably linked to the introduction of enriched learning environments made possible through technology.

I really enjoyed the class, and, yes, found it academically challenging.  We talked about “playing” with robots and Second Life, but I don’t think having fun negates the challenge.  My favorite thing was making the video for the wiki.  But I also enjoyed blogging (especially when I received comments) and reading others’ blogs.  I always learn so much from my classmates.  I got into Twitter for a while, but don’t think I’ll go there much in the future.  I’m also glad I learned about aggregators — they make visiting blogs so much easier!

Thank you to all my classmates and Karen.  You made this a great first semester for me!

The real value of simulation

April 19th, 2010

After many pages of pointing out the limits, and even dangers, of simulation in scientific research, Sherry Turkle ends her book, Simulation and Its Discontents, with a suggestion for the true value of simulation.  She draws from many disciplines, but presents the same theme:  the errors that inevitably emerge from simulations are the very data we should value.  Scientists from fields as diverse as protein crystallography, nuclear weapons design, and astrophysics all chime in with this assertion.  Simulation, they say, is beguiling because it can produce beautiful images of worlds that can never be.  However, when those images are evaluated against the real, we can finally begin to learn something about both.

The weapons designer, Dr. Adam Luft, says, “’Simulations are never right.  They’re all wrong.  Forget it.  That’s it.  They’re wrong.  Guaranteed.  There is more entropy in the real world then [sic] there is in your computer.  That’s just the way it is’” (p. 81).

Dean Whitman, a biologist, further asserts that “you need a simulation to produce error so that you can test it against reality, to figure out how it is wrong.  If you get the simulation right, you will never understand how it is right.  You need it to be wrong and you need to figure out how it is wrong” (p. 82).

The consensus seems to be that simulation does have a place in research and design.  However, the designers, scientists, and engineers should remain vigilant against their own laziness and love of the “glitzy” when using simulations.  The best simulation is the one that produces something a little ugly, a little incomplete, and a little off the mark.  That allows, or perhaps forces, the user to go back over the plans and calculations to produce something real that is, in the end, much closer to the ideal.


Turkle, S. (2009). Simulation and Its Discontents.  Cambridge, Massachusetts: The MIT Press.

Safer — and more dangerous

April 12th, 2010

I’ve now come to the final chapter of Sherry Turkle’s book, Simulation and Its Discontents, and, as usual, there is so much in a single chapter that I’ll have to post at least two blog entries about it.

In the first part of this chapter, Turkle takes us into the world of nuclear weapons development.  Until now, her descriptions of danger and erroneous calculations related only to the worlds of architecture, physics, and engineering.  Of course, if a building or bridge collapses because someone relied too heavily on simulation, that’s a big problem.  But the thought of erroneous calculations during the production of a thermonuclear warhead – that’s another thing altogether.

But the same ambivalence appears among these weapons designers as among the architects of earlier chapters.  On the one hand, testing weapons in simulation is far safer than actually exploding a bomb:  no radiation, no fallout, no blast zone.  On the other hand, the simulation removes the impact – literally – that a real explosion has on its witnesses.  It fails to inspire the same awe and respect.

One weapons designer Turkle writes about “laments that he has only once experienced ‘physical verification’ after a nuclear test.  He had ‘paced off the crater’ produced by the blast.  It changed him forever.  His younger colleagues will not have that.”  This is because, in 1992, the U.S. imposed a moratorium on nuclear testing.

As ecologically devastating as a nuclear test is, the fact is that today’s weapons are not actually tested.  They are detonated only in simulation, and the results are extrapolated from and compared with data from previous real explosions.  But that means designers with no first-hand experience are producing weapons of awesome power without any real-world assurance of what those weapons will do.  Is this really safer?


Turkle, S. (2009). Simulation and Its Discontents.  Cambridge, Massachusetts: The MIT Press.

Abdication and isolation

March 30th, 2010

As I continue with Sherry Turkle’s book, Simulation and Its Discontents, I find more and more to consider with regard to the limits of computers and even the potential harm they may cause us if overused.  Remember, I’m a child of the computer age.  I remember life before PCs and CDs, but I never struggled to incorporate computer tools into my life.

However, two points Turkle makes give me reason to pause.  First, there is the tendency to abdicate control to the computer.  For some, control is not abdicated but wrested from them by bosses and processes that demand they learn and use the latest technology.  But others give their authority away to the machines; I have to admit I could easily be one of these.  I could be like the architect Turkle describes, who failed to recheck the data he entered into the computer and ended up with a building foundation that was completely wrong.  Or I could be like the contractor on that same project who always rechecked the numbers when the drawing was produced by hand but never thought of doing so when the computer generated the plans.  Both men knew the computer only manipulated what it received, but somehow they both gave in to the false belief that computers can correct human error.

The second point in this section was the rise of isolation among design professionals, largely due to the use of computers.  In many design firms, roles have become compartmentalized, with some doing hand drawing and others entering data into the computers.  The two sides are supposed to collaborate, but neither side feels the other can really understand it.  The same happens with outside partners, like craftspeople providing materials or constructing the buildings.  They once felt intimately connected to the projects they worked on because of the relationships they maintained with the architects.  Now, those relationships have grown progressively more shallow, and some designers place the blame squarely on the use of computer-assisted design.

Into what other fields can this sense of compartmentalization and isolation extend?  In education, we wrestle with including technology or convincing others to include technology in what has always been a very low-tech profession.  In order to teach, all you really need is a teacher and a student.  No books, no board, no writing instruments, and definitely no computer.  But we do have them and we do use them.  How can we guard against abdicating our authority to the machine?

Surprised by Sim

March 26th, 2010

I’m posting an intermediate blog this week to talk about my first class in Second Life.  I was both interested and anxious as we approached the class meeting:  interested because I’ve enjoyed games like The Sims and Myst, but anxious because I was afraid there would be an awkwardness to meeting people I knew in real life under assumed avatar names.  It just seemed a little weird to be mixing business and pleasure, so to speak.

In fact, part of the assignment leading up to the class was to activate and design an avatar, tweak the costume, pick a name, and even roam around the sim world to get my “virtual legs.”  However, I found other assignments from other classes pressing this homework into the background — maybe they just seemed more legitimate, I don’t know.  I felt strangely guilty about sitting in front of the computer with a gaming program running (I know, it’s not really just a gaming program) while my husband cooked dinner or my daughter grappled with her own homework.

The great surprise was not how much fun it was, but how much I learned.  The best thing was feeling comfortable within minutes of arriving on our educational island.  Even though I knew the person behind most of the avatars, the interface wasn’t as weird as I had imagined.  And, to be honest, the virtual quality came in handy when we were all shooting arrows or drumming around the campfire while our professor and speakers were still “lecturing.”  You just can’t do that in real life!

Yes, we experienced technical difficulties.  At one point I wandered off to explore, and when I returned no one could hear me nor I them (thank goodness for chat).  In the course of trying to reconnect, I actually disconnected my hair from my head.  I’ve heard bald is beautiful, but it was frustrating all the same.  I did reconnect with the group (as well as with my hair) and teleported to Africa, the Sistine Chapel, and a Renaissance village.  All in all, I loved the experience and look forward to more SL meetings and adventures.

The economics of science

March 21st, 2010

Sherry Turkle, in the second chapter of Simulation and Its Discontents, offers so many of what I could call “pearls,” that I find it hard to choose what to comment on here.  She describes the culture of four departments at MIT in the 1980s, just as computers were coming into common use in the classroom.  Those departments all had similar anxieties when faced with the new technology, but each (architecture, engineering, chemistry, and physics) accepted computers to varying degrees and for quite different purposes.

However, one thing that resonated with me, one thing that wasn’t really in the foreground of Turkle’s narrative, was the influence of economics on the decision to accept or reject computer technology for learning and practice in a particular field.  The word that alerted me was “artisanal.”  Several of the faculty in the school of architecture felt they could preserve at least some sense of handmade designs by requiring their students to “soften” the computer drawings, using colored pencils for enhancing and filling in details.  This was called “artisanal compensation.”

I started paying more attention to the descriptions of the conflict between proponents of handmade and machine-made designs.  Sure enough, time emerged as a big factor.  Students spoke of how computers “made it possible to move rapidly through a series of design alternatives” (p. 12) in the amount of time it would take to produce only one design by hand.  Artisanal quality takes a lot of time; computer simulation takes less.  Although both faculty and students could see the limitations of computer designs, even in critical areas like computational accuracy, the press to produce more in less time is, apparently, just as unrelenting in science as in the fields of business and manufacturing.  The competition for dollars, whether from clients or grant funds, seems to permeate the classroom as it does the larger society.

I think I identify most with the architects in Turkle’s writing, especially with their love for the handmade.  They speak of the intimate knowledge of a project that comes from designing without computers.  When I make something by hand, like a pair of socks, people ask me why I don’t just buy them from the store.  Then, when they find out I not only knit the sock but also spun the wool myself and helped shear the animal it came from, they begin to look at me like I’m crazy.  However, I know more about a sock now than I did before I made one by hand.  I also know more about twist and tension and loft in yarns than I did before I started spinning.  And the satisfaction that comes from wearing something I made from scratch is hundreds of times greater than could come from a store-bought item.  But I also know that I can’t compete against the machine-driven sock industry.  I probably can’t even keep my own feet covered through hand knitting, given the amount of time required to produce just one sock.  So, I also bow to the economics of the machine, knitting only for pleasure instead of necessity.  Maybe modern architects must also content themselves with drawing for pleasure and using computers for more lucrative design.

By the way, my last pair of socks came from an alpaca named Chip, who lives on a farm in southern Illinois.

Turkle, S. (2009). Simulation and Its Discontents.  Cambridge, Massachusetts: The MIT Press.

What does simulation want?

March 14th, 2010

This week I started reading Sherry Turkle’s latest book, Simulation and Its Discontents.  The voices of discontent are important, she says, because they remind us that the virtual world, while useful, is also beguiling.  Beguiling to the point of overpowering us.  The discontents in her examples point back to the days when engineering and design were done with a lot of brains and intuition; the tools employed way back then (circa 1970 and earlier!) were mechanical instead of electronic.  Today, engineering and design are done with lots of brains and electronic tools – computers and virtual models have supplanted slide rules and cardboard scale models of skyscrapers.

But the discontents worry that the new tools are not just more advanced versions of the old ones.  The dimension that simulation adds to the creative process seems to be, quite literally, a paradigmatic shift.  Turkle plays on the famous question architect Louis I. Kahn once asked (“What does a brick want?”) when she asks, “What does simulation want?”  In other words, how is this tool begging to be used?

Turkle responds by contending that simulation wants immersion.  The only problem is that, once immersed, we are not likely to doubt the virtual environment or see its limitations.  She writes, “Sometimes it can be hard to remember all that lies beyond it, or even acknowledge that everything is not captured in it” (p. 7).  I suppose the only guard against that is precisely what Turkle offers in her book:  reminders to look beyond our borders and to somehow remember the context of our current condition.

Turkle, S. (2009). Simulation and Its Discontents.  Cambridge, Massachusetts: The MIT Press.

The hidden cost

February 21st, 2010

After reading Chapter 4, “The Promise of the Computer,” in Larry Cuban’s book, I was struck more by someone Cuban quoted than by Cuban himself.  As a preface to the chapter, Cuban includes this comment from Dale Peterson:  “But the . . . question is whether these computers will make any difference in the education of our children.  When my daughter graduates from high school in the year 2000, will she have received a better education with the help of computers than I did without them?” 1 Peterson asked that in 1984.

I feel frustrated when confronted by this misunderstanding of education’s relationship to real life.  Peterson seems to imply, “I didn’t learn with computers, and I turned out all right.”  And even Cuban, who to this point has asserted again and again that technology is a tool rather than the essence of education, now seems cowed by the advent of the microcomputer and the unknown effects it might have in the classroom – effects like isolating students from teachers and creating preschool-aged automatons.

Education uses exposure to ideas and experiences, along with the development of critical thinking skills, to give a person a foundation for surviving in and contributing to his/her society.  Technology, electronic or otherwise, is simply a tool for that process.  So, really, the value of education is a function of need.  Peterson didn’t need to know about computers; his daughter probably does.  I return to the three questions I postulated in my first entry on this subject.  The third asks:  What is the cost of investing in this [technology] tool?  What is the cost of not investing in it?  The cost of not investing in educational computer technology could be devastating.

In our society, indeed, in our world, computers are a part of real life.  Hardware will improve, software will be updated.  Nevertheless, students need to learn how to learn with digital tools.  They may never become programmers or computer engineers; that’s fine.  Yet the digital divide is real; if today’s students have no digital competence, they will have impaired survival skills.  And that certainly would be a censure of the quality of their education.


Cuban, L. (1986). Teachers and machines: The classroom use of technology since 1920. New York, NY: Teachers College Press.

Notes:   1. p. 72

What makes teachers tick

February 13th, 2010

In this third commentary on Larry Cuban’s book, Teachers and Machines, I have to say that, finally, we have come to the heart of the matter.  Cuban attempts to explain teacher behavior relating to the use, or non-use, of new technology.  Although many tech boosters, as he calls them, assign both names and blame to teachers who are reluctant to use new technology, or are simply unconvinced about its relevance, Cuban identifies two important factors that probably explain teacher behavior more than anything else.

First, he says, the classroom setting itself to a great extent dictates how teachers teach.  Teachers have restricted space, limited time, and sometimes a huge number of students with whom to interact.  Since most teachers really are trying to help as many students progress as much as possible, any tool or method must be simple, reliable, durable, and versatile.  Apparently, radio, film, and television did not meet those criteria.

Also, the culture of the profession tends to select individuals who are very much like their own teachers were.  I’ve heard it said often that we teach as we were taught.  We also tend to teach in ways that we ourselves learn best.  Although teachers can, and do, make a deliberate effort to adjust delivery methods to match student needs, it only makes sense that they, as a group, will change slowly over time.  Unfortunately, sometimes this is more slowly than policy makers want.

The conclusion here, from both Cuban and probably anyone who knows (and remembers!) what it is like to be in the classroom, is that teaching is an art.  And while teachers may use machines, they themselves are not machines.  Neither are the students.  No amount of tweaking or tinkering with tools will change the human element in education.