This little video is my response to the Creative Task for Chapter 4 (“Inspirational Week”)
The proposition for the task was this:
Take a camera, be it your mobile phone, a webcam… Introduce yourself to the other StoryMOOCers, telling us who you are, where you are from, and most importantly: which works inspired your interest in storytelling most up to now. Pick out 1-3 works of art, literature, film, TV, game, a website or else and tell us what’s so special about it that you think it might help inspire somebody else anywhere on this planet.
Yeah – I’m going through a DEVO phase again. I listen to their music all the time. Their voices and sounds are familiar, like visiting an old neighbourhood.
I get emails from Club Devo, and see snippets of mutated art from Mark M., photos from their irreverent, young new wave days, and so many artifacts of their gleeful, tongue-in-cheek self-promotion. Echoes of the back-of-the-comic-book ad junk culture that they enjoy.
Every 6 – 12 months, something bigger than my playlist brings the Devoids to my mind in a more significant way. Something new bubbles up in the media. This time, perhaps it was the unfortunate death of their friend and long-time drummer, Alan. A very sad loss, indeed. Their own “human metronome”, the driver of their complicated, syncopated rhythms, was no more.
Gerry Casale started following my Twitter feed the other day, and it made me feel a little closer to the source. The more DEVO videos or interviews I watch, the more I read, the more they’re like citizens of some weird hometown – the guys who struck out a few years before my generation, and who did all the cool art that I wish I’d done.
I love this passage from the book “We Are DEVO!” by Jade Dellinger and David Giffels:
In his book Fargo Rock City, rock critic Chuck Klosterman wrote that “Listening to (Eric) Clapton was like getting a sensual massage from a woman you’ve loved for the past ten years; listening to Van Halen was like having the best sex of your life with three foxy nursing students you met at a Tastee Freeze.” To extend that metaphor, Devo would be the equivalent of auto-erotic asphyxiation, the sexual technique of partly hanging oneself during masturbation to achieve a more intense orgasm.
(Having been to one uninspiring Clapton concert, I think that Klosterman likes Clapton a bit too much.)
Yeah, so DEVO is an acquired taste – not the flavour (or party favour) of the week.
But, yeah spuds, challenge me please. Make me think, or make me argue. If you can get me to write or think about what you’re saying, well, you’ve found that devolved nerve ending and twanged it nicely. And I thank you.
Clapton and all the 2nd-wave Brit rock gods, as incredibly talented as they were musically, never made me think about a damned thing. But the DEVO experiment got my attention, and they’re still doing it.
Robbie tells the story of a space-faring android who is the last occupant of a space station orbiting the Earth. I could easily tell that this film was composed entirely of stock footage, but then again, how easy would it be to shoot your movie on a space station (or a realistic, earth-bound mockup)? Nonetheless, the repetitive, stock-footage appearance of it put me off a bit. Aside from that, Robbie is an engaging tale about survival, loneliness and angst from the perspective of an artificial intelligence.
I don’t know if 4000 or 6000 years of feeding its neural net with information would result in an android that would have dreams – literally flights of fantasy – and not for one moment did I buy the premise that Robbie wanted to be Catholic.
I’ll say that again: Catholic. I’m not anti-Catholic or anything, but such a specific choice of religion seems out of place. Is the author of this piece likening Robbie the Robot to Jesus, by virtue of his symbolic impending death (and, do we presume, rebirth)?
My expectation of an autonomous, artificial intelligence would be that it would be somehow more neutral, probably atheist or maybe humanist. It either wouldn’t believe a religion or perhaps it would believe in the species which created it. Okay – so, I’m an atheist and I have a hard time with that aspect. I’ll leave that point alone, and get on with it.
No – I just cannot leave the religion aspect alone on this one…
The idea that a robot with what we consider to be A.I. would care about one religion over another probably says more about the film maker’s attempt to imbue his protagonist with some kind of “soul”, so that the viewer will empathize with him. “If the robot wants to believe in God, then he must be more like me than I thought. If he could consider accepting God as his creator, then he must have a higher level of enlightenment, just like a human.”
If, however, Robbie were to possess the actual mental engrams of a former human being – if a human being’s actual thoughts and personality could be transferred into Robbie’s memory and mechanical frame – then THAT would convince me to feel sympathy for Robbie’s plight (his curse of immortality).
But so long as I believe that Robbie possesses a 21st century version of artificial rationale, I can never consider him conscious, and so I will never accept him for much more than a glorified electric screwdriver left behind by a space workman. How cold-hearted am I? I just didn’t buy into this movie’s attempt to tug my heart strings.
Gumdrop was a sweet little comedy, and a gentle visual sleight-of-hand. By substituting an android for a young human actor auditioning for an acting job, the film gets us thinking about the values and hopes of the young actress, mechanical or not. Gumdrop was a light-hearted examination of the casting call too: do we treat each other like commodities or machines? Does the audition process demean the female actor? Should human actors be worried, now that we live in a world where lots of supporting and lead characters only exist in an animation database, but never in the physical sense?
Gumdrop’s vacuum cleaner gag was very funny. But does that mean she’s really just a glorified Rosie the Robot? What happens when the acting career is finished, or when she outlives her warranty? Will she get literally dumped on the scrap heap?
For some reason, I care about Gumdrop more than Robbie. Maybe it’s the human motion and voice. She’s much more likeable than Robbie. Like they said in Pulp Fiction, personality goes a long way.
True Skin is an extremely well-made and convincing film. Very Blade Runner-esque. Great Raymond Chandler-inspired dialogue. “Their eyes burned holes in my pockets” was a brilliant line.
So, the one thing all these films have in common is that they live or die by the quality of the plot and the dialogue. Yay, human writers!
In terms of the humanity proposition of this week, I think this film does the best job of articulating some major issues:
If there comes a time when we can no longer define or recognize humanity by its fleshiness, will it still be considered human? Is a cyborg who is less than 50% flesh and bone still a human being? Maybe the more metallic and less meaty we become, the less human we will be perceived to be. Ben Kenobi said of Darth Vader: “He’s more machine now than man, twisted and evil.”
On a personal level, if a friend of mine had their thoughts transferred into a little computer, and I could interact with them (either by text, or maybe Max Headroom style on a display screen), would I still consider them human? Probably not, if I could put them into Standby Mode, or turn them off, like any other device. So, maybe autonomy and self-preservation are other key aspects of being a sentient being?
I loved Avatar Days. The simple concept of transplanting a fantasy persona into the owner’s real-world life and society is an extremely powerful thing. It’s done so matter-of-factly and carefully that it becomes a real artistic social statement. Coolest of all, it’s contemporary. You can get immersed in World of Warcraft or Second Life and become a sword-swinging, spell-packing nerd of Azeroth today.
I’ve played around in Second Life a bit in the past (reporting as “Earnest Oh”), so I can appreciate the appeal of being able to put on that second skin and walk around (or remove it and assume the position, in a lot of people’s cases… yeesh, people). It makes you wonder about the boundary between fantasy and reality, for one thing. I read somewhere that, internally, your brain does not distinguish between a memory of a real event and a memory of a dream. They’re both equally valid as memories, even if one of them didn’t occur in the physical world. So, if our brains are already wired to accept dream-memories as valid, why wouldn’t we send coma victims to Azeroth to kick some goblin ass as part of some cognitive stimulation therapy? At least they’d have something interesting to do.
What about The Matrix as Long Term Care Facility? Let me extend that interesting idea into my personal life experience…
My Mother was a long-term care resident at our provincial mental health hospital for many years. I’m willing to bet that if my poor Mum were able to choose between (A) staying in a semi-vegetative state with little physical activity and not much on TV, or (B) being Dorothy in The Wizard of Oz (her favourite movie), she’d have gone for Option B and never looked back. And if I could have visited her on the yellow brick road instead of in the awkward, cold silence of a hospital visiting room, I know which choice I’d have made too.
My stream-of-consciousness explorations of MOOCs and MOOC-related online chatter brought me to the following article, from wired.com. I never really thought of a MOOC as being “edutainment” before, but I think it just might represent a social merger between mass education and mass entertainment, between social learning and social media.
More than that, the idea (below) that the author sees lifelong learning as a “continuous, on-the-job process” (e.g. vocational) seems to me extremely practical, possible, and a little too skewed towards commerce. IMHO, MOOC-based education has, at some level, been fueled by a business model, like it or not. It’s free – but not without some cost.
This article was written by a Marketer or a Market Analyst (read: business person) – not by an Educator.
MOOCs can be much more than marketing and edutainment. We believe they are likely to evolve into a “scale business”: one that relies on the technology and data backbone of the medium to optimize and individualize learning opportunities for millions of students.
This is very different than simply putting a video of a professor lecturing online.
The initial MOOCs came from a “process business model” where companies bring inputs together at one end and transform them into a higher-value output for customers at the other end — as with the retail and manufacturing industries.
But over time, an approach where users exchange information with each other, similar to Facebook or telecommunications (a “facilitated network model”), will come to dominate online learning. This evolution is especially likely to happen if the traditional degree becomes irrelevant and, as many predict, learning becomes a continuous, on-the-job learning process. Then the need for customization will drive us toward just-in-time mini-courses.
The MOOC I’m taking, E-Learning + Digital Cultures, continues to unfold in front of me, gradually showing me new perspectives and more detail. But it’s not for the impatient…
For me, being in a MOOC has felt like being seated inside a vast, unlit stadium where you can hear other attendees whispering and you can see their messages on the walls, but otherwise, they remain invisible. Getting acclimatized – even feeling welcome – does not come right away.
A few weeks later, this is still more or less my experience, but my eyes seem to have adjusted to the darkness now – I feel like I can see better and interpret more than before.
Gardner Campbell’s Open Ed 2012 keynote address hit me like a bolt to the brain… [It] made me feel inspired and energized to explore my own spaces between art, technology and learning.
In the Week 2 resources, under “Perspectives on Education”, the video of Gardner Campbell’s Open Ed 2012 keynote address hit me like a bolt to the brain: his passionate advocacy for truly open learning, his challenging definitions of what he felt it should be, and his support and appreciation for the interdisciplinary responses of his students – all of these factors made me feel inspired and energized to explore my own spaces between art, technology and learning. I think I may have found a new inspiration – someone to study more closely.
When I was at the Emily Carr College of Art + Design in the eighties, I learned about media theory (e.g. McLuhan), multimedia and hypertext (e.g. Ted Nelson), and visual literacy and visual perception (e.g. Tom Hudson, Rudolf Arnheim, Johannes Itten). Some things I learned from reading books or watching videos, but a lot of information I got first-hand, from seminars, workshops and special research projects. The people I learned from in person were all artist-educators who were actively exploring ideas through their own art practice or educational research, often using consumer tech on shoestring budgets.
Back in my days as a multidisciplinary art student and research assistant, my greatest personal challenge was to interpret and synthesize all the raw information, and later, decide how to express my experiences. Many of my extracurricular readings covered topics in AI, cybernetics, user interaction, and theories of learning and education. I was all over the place conceptually, and loved it. Science educators like Seymour Papert and Alan Kay caught my interest for their explorations with interfaces and user (student) interaction. I read about the MIT Media Lab, and all its explorations into media, technology, art and science. I read articles from the ISAST journal “Leonardo”, and learned about PhD-level multidisciplinary art and science research projects. A good deal of the theories and terminology were just over my head, but I had found an interesting, fertile territory to consider, in the intersections of art, education and technology. Convergence was just starting to happen, and it was a fascinating thing.
My multimedia instructor, artist Gary Lee-Nova, helped me understand the relationships between modern analog and digital media, perception and society. Gary talked about author William Gibson and the idea of cyberpunk way before it was popular. Research, exploration and personal development were fun back then.
My mentor back in art college, Dr. Tom Hudson, opened my mind to modernist Bauhaus art education patterns, and under his guidance, we updated and reinterpreted them by using desktop computer graphics programs to research visual literacy and drawing systems.
After graduating from Emily Carr’s four year diploma program in 1989, I opted to pursue computer graphics, animation or commercial design as my career path, instead of art education. Tom had, at some level, hoped I would continue pursuing art education as a career. I did teach computer graphics in night school for a few years, tutored art privately, and was an Artist-in-Residence in the Vancouver School Board, but I never went into education in a more formalized way, like by pursuing a degree.
After 20 years working in the commercial sector, bringing visual design services to software/hardware developers and business people, the exciting theoretical, creative aspects of my thinking felt as if they had atrophied and needed some dusting off. My modus operandi had become one of speed and economy: skimming the surface of the pond of ideas to get from questions to answers, and from initial request to practical deliverable, as quickly as possible. Any education I took from my graphics career was of a short-term, tactical nature. I learned what I needed in order to fulfill a particular short-term goal. In that kind of mode, there wasn’t much time or interest in theory.
Now, I’m employed in Vancouver’s largest vocational college, helping teachers to adapt their experience and materials into online courses. In a higher education institution, my perceptions and reactions have had to adjust to a more deliberate, thoughtful form of delivery: integrity over speed, and quality over quantity.
Now, it feels like I’m rediscovering the joy of the interconnectedness of ideas – a multidisciplinary approach to things. I’m fascinated to see some of the topical connections between Seymour Papert, Alan Kay and Gardner Campbell.
I can, and should, now enjoy taking a deep dive into topics, instead of just skimming the surface.
Themes explored this week included technological utopianism and dystopianism, and the idea of technological determinism.
I watched these videos:
Video: “Day Made of Glass 2” (Corning)
The “Glass as lifestyle” approach is somewhat corporate wishful thinking, IMHO, and relies too much on groovy futuristic sci-fi touch interfaces to make the glass medium look exciting. Tinting windows? Sure. Use my bedroom window to help me decide what to pull out of my closet that is only a few feet away? Fat chance.
A massive sheet of glass in the middle of a demonstration forest would never be that clean and perfect.
I’m sure it would also be dangerous for the wildlife (dead birds having crashed into it all the time = scary discoveries for young girls).
In the classroom, students are just well-behaved passive recipients of the Teacher’s initial presentation, with nobody raising their hand to ask a question or ask to go to the bathroom. In classrooms today that use interactive whiteboards, students are often encouraged to come to the front and move images around as part of the lesson. Why do presentation and participation (at the beautiful touch-table) need to be presented as a group activity? In the Corning classroom, students are depicted and treated mainly as one group/collective. Is this a (subconscious) corporate wish for collective harmony? It’s okay for the kids to pick their clothes or to colour Dad’s dashboard full of hearts – that’s harmless kid stuff – but beyond that, personal expression or individuality seem muted in Corningland.
The glass-based solar array on the school roof was a nice image, but they could have done more to humanize their mission, and embrace corporate social responsibility. Like, why not show a kick-ass interactive graffiti wall donated by Corning to some local Community Centre?
Also, why are the young girls private school students? Is that a value judgement about an educational utopia? Does that mean that Corning’s utopian vision would only be available to the upper class and rich medical specialists like the Dad? That would leave something of a dystopian “plexiglass” reality for the lower classes, I guess… 😉 Definite technological determinism there, not to mention class-ism.
Video: “Productivity Future Vision” (Microsoft)
In Microsoft’s vision, paper seems to have disappeared, replaced by flexible touch-sensitive surfaces. Hard for me to accept that. Paper will remain cheaper than plastic for at least the next 10 years, and more ecologically friendly forever. I noticed that keyboards are still around in Microsoft’s future vision, at least in the office when one is preparing the annual report (or whatever that dude was doing).
Apparently, nobody at home or work is concerned about any repetitive stress issues from having to do all those large arm motions to swoosh images around on all those massive interactive surfaces. How many overweight CEOs are going to throw their back out trying to clear all the virtual files off their ginormous desk-walls?
This idea that all surfaces will be interactive and high-res is completely fantastic – a utopian vision and obvious excuse to demo Microsoft’s Surface technology. It is technologically skewed towards the vendor-manufacturer’s wet dream of an ideal consumer family.
I watched these videos:
This animation showed symbolically how cultures elevate and then scrap technologies, hoisting them to a high level of dominance, only to turf them in favour of the next big thing. The animation design style mimicked Javanese paper-cutout shadow puppets, which was a very compelling choice, and lent a sense of tribal primitiveness and other-worldliness to the characters.
This live-action comedy-drama uses the metaphor of magic paper bags and sticky notes to illustrate behaviours, interactions and expectations in social media (Facebook, primarily).
“Thursday” is a charming animation showing the tension and interrelation between modern human electronic culture and the natural world that continues around (and in spite of) it.
The design style of the animation evokes video games in its pixelly appearance and representation of space (isometric projection and a “side-scroller” look and feel).
Thursday seems to be saying that we live in a vastly technological society, but the natural world is vaster still, and more persistent. The little mother blackbird adapts her song to the tunes she overhears in people’s cellphones and alarm clocks, steals a bit of wire to build her nest, and shelters her chicks in a satellite dish. Nature adapts.
Mankind borrows echoes from nature, putting little bird-like chirps into its mechanical tools – as an ancient comfort perhaps? Generally, it’s man who seems to be living with blinders on, surrounding himself with mechanical proxies for nature, and cloistering himself away from it in his dark, hive-like internal cubicle farms. Not until our human protagonist sees “the big picture” from space (and later when he contemplates the little crashed bird on his windowsill) does he seem to reconnect to his natural world.
Ultimately, the theme I saw here was freedom and survival of the natural world, alongside the structure and abstractions of human digital culture. I think the true protagonists of this little film are the birds.
Last year, I read the astute saying: “If you didn’t pay to use a service, then you are the product being sold.” I feel like that kind of “buyer beware” maxim could be applied to ease-of-use in information technologies too. Here’s what I mean…
If a technology tool or platform is popular, part of the reason is likely that it’s easier to use than the competition; in other words, the usability aspect of its design was a core business strategy. Hardware designers might talk of “build quality” and ergonomics, but it’s all about usability.
Today, usability is deeply integrated into product design and marketing. For example, take the rise of tablet computing platforms – most popularly, the Apple iPad. Many users who are new, technologically intimidated, or very young or old will likely have an easier time using a touch-tablet like the iPad than they would using a desktop computer. Compared to the experience of manipulating a mouse and keyboard on a desk to move objects on a screen, touching your finger to the screen of a tablet (particularly one whose OS is designed for touch) is much easier for a new or unfamiliar user. You don’t have to “get used” to using a mouse (i.e. training yourself that a wrist movement of a few inches from left to right across your desk will translate into a one-foot left-to-right motion of a pointer on the screen in front of your face). This basic aspect of the windows-mouse-icon-pointer interface is actually a barrier to use: a new user must practice a little bit before they can easily manipulate graphical objects with a mouse.
In this regard, smartphone and tablet-based computing have been absolute game-changer technologies for many people. Apple and many other manufacturers knew this, and were waiting for touch-screen technology to become sophisticated and inexpensive enough to bring to the mass market.
These devices are used to access many free and for-pay information and media services. People don’t really think about the way it is – they just want to be able to use these devices – these new gadgets – to get at the news, music, movies, or games that they want. Corporations seem to have taken a cue from the original “information on the Internet should be free” ethos that evolved through the 70s, 80s and 90s, and subverted it by making books, apps and games available on tablets for only a few dollars, or even for free. Buying an iPad game that will give you dozens of hours of fun will cost you about the same as a pack of bubble gum. That’s one barrier gone. After you download it, you can use it right away – installation is usually fast and minimal. That’s another barrier gone.
From a business perspective, making a platform easier to use (usability), and making the purchase process easier to complete (one-click fulfillment) and easier to justify (cheap or free) will easily result in more purchases. Amazon’s “One-Click” purchase button was the first place I saw this kind of supermarket-checkout “impulse purchase” tactic at work. I had disposable income, and Jeff Bezos and Amazon made it extremely easy for me to dispose of it on a whim. I could “impulse buy” a thirty-dollar hardcover book with even less effort than it would take to grab a candy bar in the checkout aisle at Safeway. Tablets with apps and books that can be bought for under a dollar, while you’re lying in bed at night, are about as convenient and impulsive as it gets.
It means that the end-user consumer must exercise some discretion and willpower to avoid nickel-and-diming themselves down to a negative balance in their bank account. A high degree of usability in the device itself makes for a pleasing and satisfying user experience, and ubiquitous cheap online products in a “one-click” marketplace make it deceptively easy to please the vendors.
So, if it’s too easy to use, be careful. You might use it too often.
Today, I enjoyed a visit and stimulating discussion with one of my earliest art school teachers, John Wertschek, currently an Associate Professor at the Emily Carr University of Art and Design in Vancouver.
I suppose that many of John’s former Foundation students would probably agree that he has, in one way or another, challenged them to imagine the previously unimaginable. Certainly this was the case for me in John’s “3D” course, back in 1985 at what was then called the Emily Carr College of Art + Design.
I remember “The Rock Game”, an “exercise” (for lack of a better word) situated in a low-lit room on a table that was surrounded by mostly high-school-aged young people. On the table was a collection of rocks of varying sizes, which each participant would take turns moving or re-orienting. That was the whole thing. The Rock Game could be called a “no rules” game, but it required reaction, space, material and personal decisions, so although “rule-less” it might have been, it was not without structure or outcomes. Very zen, or whatever. 🙂
As an eager 19-year-old who wanted to experience many new things, my take-away from that simple little game was “pay attention, feel, respond, and act for yourself”.
John liked to use words – their meanings, origins, sounds and similarities – to illustrate and challenge patterns of thought. Sometimes the challenge was a visualization and/or a creative thought experiment, such as “build a device with which to weigh a dragon”.
I also took away one deceptively simple piece of practical advice from John: “The two most important books you’ll ever use in your life are the Yellow Pages and the Dictionary.” Something in that advice told me that the door was open for me to go through, that the resources and information were out there if I looked for them, and that I should give myself permission to act when I needed to. (What the hell was I waiting around for anyway?)
So today, my dictionary and my business directories are Wikipedia and Google, and if I still have rocks to move around, they’re metaphysical or, more often than not, composed of pixels. But the personal process contains a similar proposition: make a move, and do it with intention and integrity.
Today, John told me that back when I was doing my Foundation year, about 50-60% of the students were fresh out of high school, and that now, the number is more like 85%.
For a young generation of digital natives, acclimatized to immediate, packaged information and real-time access to a thousand opinions and personas, the kind of face-first, open-ended explorations that can cause you to question, reflect and think for yourself seem more important to me now than ever.