Learning and Technology

Mystery vs. Knowledge

Across religion, mythology, and even everyday expressions, we are warned against the perils of knowledge and the pursuit thereof. In the Christian Bible, Adam and Eve doomed mankind by eating the forbidden fruit from the tree of the knowledge of good and evil. In Greek mythology, Zeus created the first woman, Pandora, and gifted her with a box that (unbeknownst to her) contained evil. Of course, that evil was unleashed into the world when Pandora opened the lid to look inside. We are reminded that “ignorance is bliss” and “curiosity killed the cat.” Yet if we live in a culture where seeking answers is considered dangerous, why do we rely so heavily on technology to replace mystery with knowledge?

Technology can be likened to a modern oracle: an omniscient, omnipotent force that most of us have at our fingertips. With its educational, encyclopedic, and research sites, the Internet can be leveraged as a point of reference. Social media not only connects us to others, it also acts as an extension of our identities – revealing to friends, family and strangers alike our location, age, background, marital status, interests and dislikes – all without the need for us to speak or meet. As long as there’s a GPS signal, we can find our way to just about any destination, often “blindly,” complacently obeying the directions we’re given instead of consciously navigating.

The scope of our individual intelligence has shifted with the advancement of technology. We haven’t become smarter, perhaps only more resourceful, and that resourcefulness hinges on access to technology. Even then, one is only as resourceful as one’s ability to properly use a given technology. In an article for Psychology Today, Tomas Chamorro-Premuzic outlined two components of intellect and how they are utilized: fluid and crystallized intelligence. Crystallized intelligence represents what we actually know (i.e., information we can reference offhand from memory), and it is dwindling as our fluid intelligence – our ability to gather and process information (i.e., what we exercise when we use search engines to answer a question) – has grown alongside technology.


As a teen in the late ’90s, when the Internet was just beginning to connect people and the notion of social media was still a ways off, I used to truly pine for guys I had a crush on. Without technology, I took what was available, be it a brief conversation or a look, and filled in the gaps with my imagination. After falling for a co-worker at Toys “R” Us, I took note of when he’d come into the store to pick up his paycheck. If I was scheduled to work that day, I would station myself close to the entrance of the store so I could strike up a conversation with him. The interaction would last all of 30 seconds to maybe a minute, but in the time leading up to that moment, I would feed off the wonder of what he was thinking, what he was up to, who he truly was. After he would leave, armed with those 30 seconds of further information, I’d analyze and reanalyze what was said and what it could mean.

It’s funny and embarrassing to look back on that time, but I do so with fondness. When we meet someone who interests us today, it takes little to no effort to find out seemingly everything about them, thanks to the technological imprint we each leave. A friend recently mentioned that she had to explain to her daughters what a blind date was because the arrangement rarely occurs anymore. Even if someone wants to introduce us to a friend, all we have to do is view the matchmaker’s Facebook profile and scroll through his or her friends to locate the bachelor/ette they’d like to set us up with. By discovering so much before even meeting, are we taking away the chance to get to know one another face to face instead of face to Facebook? Although knowledge is power, is there such a thing as too much knowledge?

In a brilliant article for Wired magazine that dates back to 2000, computer scientist and co-founder of Sun Microsystems Bill Joy reflected on “Why the Future Doesn’t Need Us.” In it, Joy commented on how technology has advanced our society’s wellbeing in fields such as medicine, while simultaneously advancing humanity’s demise with perpetually more sophisticated weaponry, warfare, hackers and the like. “Can we doubt that knowledge has become a weapon we wield against ourselves?” Joy asked.

This subject has been at the heart of exploration for years, in books such as Aldous Huxley’s Brave New World and George Orwell’s 1984, in philosophical works such as Jacques Ellul’s The Technological Society, and in dystopian films (many based on books) such as 2001: A Space Odyssey and A.I. In each of these works, it is easy to see how the benefits of technology keep advancing until technology evolves past human control, becoming too powerful to stop. As Bill Joy noted, “We [scientists] have long been driven by the overarching desire to know that is the nature of science’s quest, not stopping to notice that the progress to newer and more powerful technologies can take on a life of its own.”

Necco’s sense of curiosity tends to center around good deals, his next meal, and when his mom is going to stop writing to give him a cuddle.

The wheels (or should I say the 0’s and 1’s?) of technological advancement are in motion and have been for years. While extremists such as the Unabomber, Ted Kaczynski, resorted to violence in an effort to draw attention to, and protest, modern technology, others such as Joy and Ellul remind us that as humans with free will, we still have a choice: we can use (or limit our use of) technology in socially responsible, non-violent ways. In the pursuit of knowledge, what we learn – whether about history, a location, others or even ourselves – isn’t always pleasant, or what we had hoped to find. Nonetheless, with knowledge we are able to move forward, either by accepting what we have learned or by deciding to make a change. The alternative, the unknown, can be intriguing, but paralyzing.

Old-fashioned Challenge: Go out in the world and actively pursue knowledge, whether it be by attending lectures or meeting someone face to face instead of exchanging texts. Having concrete experiences can leave a lasting impression, which may just help to crystallize our intelligence and allow us to continue to be creatures of thought rather than creatures of Wi-Fi.

© Tia Gargiulo, 2015

Question: I’ve been kicking around the idea of unplugging from all portable technology for a day, but would love to have others participate and meet afterwards to compare experiences. Would that be of interest to anyone? Those outside the Seattle area could participate as individuals or as hosts for another city, if they’re so inclined. Leave a comment below or drop a line at: tiagargiulo@hotmail.com to let me know your thoughts. Depending on the response, I’ll come up with the particulars for when, how, etc.

Life Is but an (Augmented) Dream

Imagine you go to a party. On the way in, you put on a pair of “smart glasses”: a self-contained, cordless computer. Once the glasses are on, you experience the party, guests and environment as you would normally, while simultaneously experiencing an augmented reality: one wherein you see great works of art that are regularly housed in museums and private collections, but tonight are available for you to study from every angle, free of barriers. Except it’s an illusion; a hologram.

The above scenario was described to me by my stylist. It turns out one of her clients works for Microsoft. As one of many developers, he’s currently engrossed in the production of HoloLens, a platform marketed as a level beyond virtual reality because it blends holograms with one’s real-time environment. As my stylist talked about the technology, I imagined the film Vanilla Sky, and the swanky bachelor pad where Tom Cruise’s character entertained guests with a hologram of John Coltrane playing music. The juxtaposed reality in the film shook me up when I first saw it years ago. The fiction of the 2001 film versus a real product in the present was particularly overwhelming to ponder while waiting for a fresh coat of hair dye to take root.

Party décor is just one of HoloLens’ astounding, seemingly infinite potential applications. Beyond home entertainment and gaming, demos of the “holographic computing platform” show users teaching, learning, communicating and designing. Whether working, playing or connecting, HoloLens can impact every area of life. The seamless integration of the real world with an alternate reality offers users the ability to create (whether for fantasy or reality) without physically leaving the room they’re in.

Death doesn’t stop John Coltrane from performing as a hologram in the film Vanilla Sky.

In many ways, this seems like a logical, natural next step in technological progress. Bluetooth devices emanate familiar voices in our ears. Skype and other video-chatting services display loved ones right in front of us, even from states or countries away. The Internet is readily accessible to give us a preview of menus before we step foot into a restaurant, while trackers and GPS find us the quickest routes and navigate us to our destinations. Weather updates mean we need not peek out the window to know we’ll need an umbrella, sunglasses or jacket; instant messages and texts crop up in rapid-fire succession to keep us connected; and music, movies, books and other streaming media wash over our senses at all times. Suffice it to say, we already exist in a self-contained technological bubble. At the moment, though, all our apps and devices are separate from our beings. Though we rarely do, we can still put down our phones and disconnect while at work or visiting. With the incorporation of HoloLens technology, are we losing ourselves to augmented reality?

As explored repeatedly in this blog, technology connects us, bringing us closer together, but it also alienates and pushes us further apart from one another. Will technology such as HoloLens bring us one step closer to completely cutting ourselves off from direct contact with each other? I worry about the lines continuing to blur, until real life and dream become indistinguishable.

Demo for HoloLens.

Then again, I think of the learning curves, adjustments and integrations of society’s technological history. For instance, look at the advent of radio: a 1938 broadcast of an adaptation of H.G. Wells’ classic The War of the Worlds led to panic and confusion among listeners who didn’t realize the warnings of an alien invasion weren’t real. Upon first seeing moving images of trains on film, early audiences were reported to have jumped out of their seats to avoid what was an illusion of sudden death. Presuming HoloLens takes off, it could be the next piece of technology to initially astound, then fade into our routines as unremarkable. After all, the novelty and mystery of innovation wears off. How often are we wowed today by brewing ourselves a cup of coffee? Or driving around town?

My head felt heavy as a brick after researching HoloLens. I closed my tablet once I’d had enough, and looked up at the world around me. Outside my window, Seattle was doing its thing. A torrential downpour had soaked the ground after the sky opened up a short while before. By that moment, though, the precipitation had passed, and the dandelion glow of the sun mixed with another batch of menacing clouds on the horizon. Then I looked over at my cat, who was snoozing on the loveseat. I smirked and walked toward the couch. The simple perception of real life unfolding in real time was a poignant reminder of what’s really meaningful: to turn off, put away, and disconnect from our technological reliance. To look up, out, and around to take note of what is. What truly is.

Technology offers ceaseless wonder that can take our breath away, but what’s most stunning and wondrous is what’s authentic. At least, that is my belief. As I curled up on the couch, listening to passing rain followed by chirping birds being lured out by the sun, I laughed at my cat, who was crawling all over me and drooling (he thinks he’s a dog). He finally got comfortable and we nestled into each other. As he purred, I smiled. There was nowhere else in the world I would have wanted to be. No augmented reality could have heightened the happiness or satisfaction I felt. That is my reality, in the here and now.

Old-fashioned Challenge: Live real life to its fullest.

© Tia Gargiulo, 2015

Survival of the Techiest

While seemingly cold and detached, banking is actually a personal process. Money may be a means for survival, but it also funds lifetime goals such as higher education, travel, and homeownership. Before transitioning to a lockbox role, my Mom worked for years as a bank teller. The image of her alongside a row of other tellers is etched in my memory. They all dressed professionally and greeted customers with a warm smile. Times have changed, though.

When I walked into a bank last week to pay my mortgage and cash in some loose coins, I felt akin to a character in a time-travel film who is experiencing the future for the first time. I was overwhelmed by the foreign environment. The lobby (where I stood dumbfounded) held an ATM, two unmanned bank teller stations, and what looked like two more elaborate ATMs. I assume these were virtual tellers, but couldn’t be certain because there were no signs, and the flashing screen savers gave no hint either. On the other side of the bank, there was one vacant enclosed office and three cubicles for personal tellers. In the first cubicle, an employee sat engrossed in a phone conversation with his back facing the lobby. The second cubicle was empty. In the third cubicle, a man behind a desk talked to a woman sitting across from him.

After a couple minutes of staring helplessly like a deer in headlights, I started to fret. Admittedly, I was too intimidated to approach one of the self-serve kiosks. Yet it was unclear whether anyone had detected a humanoid in the force field, or, for that matter, whether the employees working at that branch served the public. Finally the woman from the third cubicle scurried over asking, “Did you have a question?” I sighed in relief as I gave her my payment.

Bask in the warmth of a “live” personal teller. Photo credit: Dollar Bank.

Before I could mention the loose change, the teller disappeared behind a closed door. Within moments, she reappeared with a receipt. She’d already walked a few steps away before I could utter a word about the coins. “Is there anything else you need?” she asked in a polite, albeit strained, tone. Then she softened with the clarification, “I just want to make sure before I go in the back again.” I profusely expressed gratitude and assured her that this was the last of my needs. Once more, she fluttered behind the closed door, returning in a dash with $17.50 in paper bills and quarters. Then she was gone, having returned to her cubicle. During the less than 10 minutes I was in the branch, no other customers came through the door.

The experience took me aback and inspired reflection on the self-service society we live in today. Are the days of customer service, of people helping people, a thing of the past? Self-service kiosks are nothing new; ATMs, or automated teller machines, have existed since the 1960s. Original incarnations were limited in capability and met with very limited customer acceptance. In the years since, as ATMs have gone from unwelcome to ubiquitous, self-checkout and check-in kiosks have rolled out in grocery stores, at airports, in hotels, even in some fast-food and restaurant chains. To use these kiosks, one must still go through the effort of traveling to the destination, but the banking industry is breaking even that barrier.

With innovations in online and mobile banking, customers can take care of many needs from home or on the go, at all hours. With a few clicks, customers control their own finances: shifting funds, paying bills, ordering checks, even making deposits by snapping a photo of a check with an app. While these advances add convenience, no technology is perfect. Glitches cause malfunctions, cards get stuck in ATMs, and screens freeze, forcing delays and system reboots.

My Mom worked behind a counter similar to this one during her days as a bank teller.

Increased acceptance and use of online banking technologies, along with the reduced employment costs of ATMs and virtual tellers, have prompted banks to close branches and operate with skeleton staffs. In turn, this has caused layoffs and reduced job opportunities, and remaining employees are left to pick up the pace by covering a multitude of roles. Customers pay a price too: some banks now charge a “teller fee” for choosing a live teller over a machine.

For the time being, real people working in banks have not been rendered completely obsolete. There are still tasks that require human interaction and assistance. Whether appealing to older customers, the anti-tech crowd, or those who simply prefer people to apps, there are banks and credit unions that tout “real tellers” as part of the appeal of their banking experience. There’s no denying that technology and self-service will continue to grow across industries. Survival of the fittest increasingly involves technological prowess. Regardless of one’s opinions or capabilities, it appears we’re left with little choice but to adapt or be left behind with an unpaid mortgage and the weight of loose change.

Old-fashioned Challenge: While it’s vital and necessary to adapt to technological advances, don’t forget about the human experience. Emphasize kindness in personal interactions, and value those who still offer customer service while looking us in the eye, not as pixels on a screen.

© Tia Gargiulo, 2015

In Defense of Baritone Oompa-Loompas, Hard of Hearing Pets, and the (Non) Death of Stevie Wonder

My sister and I got into some pretty epic arguments when we were kids. While watching Willy Wonka and the Chocolate Factory, we disputed whether the voices of the Oompa-Loompas were their own or those of other actors. I was suspicious, while my sister vehemently believed that the Oompa-Loompas could have deep voices. Then there was the debate over whether funnels worn by pets were meant to prevent them from licking or scratching themselves, or whether they assisted the hard of hearing. When Ray Charles passed away, my sister asked if I’d heard the news about Stevie Wonder. Luckily by then, the Internet was both in existence and at our disposal. Our childhood arguments stretched out until either a definitive answer was found or we got bored and moved on. In the case of Ray Charles vs. Stevie Wonder, with just a few keystrokes we learned that Wonder was, indeed, alive and singing.

The Internet is a wealth of resources. Think for a moment about how many times a day we go online to answer questions, both big and small. From clearing up a disagreement, to finding the name of someone or something on the tip of our tongues, to checking the hours of operation for stores, restaurants or businesses, we can determine the answer in a snap. In the not-so-distant past, the phrase “look it up” had a very different meaning.

At that time, there were a few options for ending a fight. One could turn to an authority figure, such as a parent or teacher. My sister and I did this when (at 8 and 6, respectively) we called our grandfather to verify whether the conjunction “but” had one or two t’s. As my sister whooped in victory, I was left to feel like a butt (two t’s noted). Another option for settling the score pre-Internet was digging through a reference guide, such as an encyclopedia. This added flair to the mix, as one could point with supreme confidence to the passage that proved them right. Better yet, one could review the object in dispute. This happened in high school when I got defensive about the song “Dream On.” After hearing an older, raspier Steven Tyler sing the song, I’d erroneously thought it to be a cover of another band with a smoother-sounding vocalist. After a heated exchange, my friend proceeded to put on the original 1973 album. As he handed over the record sleeve to review, he asked, “So who’s singing?” Even more biting than conceding that I was wrong about Steven Tyler covering… Steven Tyler, was seeing that the track came from Aerosmith’s self-titled debut.

Naturally baritone or naturally lip synced? The great Oompa-Loompa debate.

Reflecting on these old battles, a few observations occur to me:

1. I’ve had more than a couple of arguments I’m not proud of.
2. Not having answers immediately available created discourse and debate.
3. Finding the answer was part of a process. Whether seeking an authority, researching the answer, or reviewing the source itself, there was actual legwork involved.
4. The resolution of an argument was the conclusion of an experience, making the lesson just as memorable as the events that led to the answer.

The embarrassment I endured when proven wrong (or the vindication I felt when proven right) is probably part of the reason I still remember those arguments. Yet I also remember the correct answers to all those obscure, inane points of trivia.

With the Internet, a point of disagreement can be settled in a moment and without ever leaving one’s seat. Voice-command search assistants such as “OK Google” and “Siri” mean we don’t even need to type our query! This allows for learning on the fly, which is incredibly convenient and useful. However, what of the process of learning? Has it fallen by the wayside? Without the passionate squabbles or the legwork of research, do we still retain the information we reference online?

Studies vary on how online learning vs. traditional methods of learning impact memory, although it has been suggested that the spatial landmarks of physical textbooks may aid recall better than e-readers or mobile screens. For as often as I rely on the Internet to answer daily questions, I just as easily forget what I discover. When I read The Andy Warhol Diaries, for example, I leveraged the Internet on my phone at every step to learn more about the celebrities, politicians, fellow artists, news and events that Warhol mentions. While I learned so much in the moment, the knowledge was aqueous and flowed out of my mind as soon as the next point of reference came up.

Stevie Wonder: NOT dead.

The information available online is vast, but there is a real danger in accepting inaccurate information, or a subjective opinion cleverly disguised as fact. Even popular reference sites, such as Wikipedia, can fall prey to unverified, unreliable data. A friend who used to work as a teacher said she would always advise her students to “follow the citations” to verify accuracy. Of course, the spread of inaccuracies existed long before the Internet. Figures of authority, books, reference works and research can all contain errors. Perhaps the Internet is guilty only of the volume of, and ease of access to, bad information. At the same time, this threat could make us better students by sharpening our skepticism, motivating us to sort through all channels of information in pursuit of verifiable, reliable, correct answers.

It’s fun to imagine what kinds of disagreements my niece and nephew are bound to get into. I wonder if they’ll refer to our family for answers, or if from an early age they’ll bypass us in preference for technology. I can easily envision a future where my family will rely on their technological prowess to assist us. At least I can impart some pearls of wisdom in how to be gracious in the event of being proven wrong. After all, I have years of experience in that realm.

Old-fashioned Challenge: Use the Internet as a starting point for research. While it’s an unmatched tool for quick reference, delve deeper by taking a class, checking out books, or watching documentaries.

© Tia Gargiulo, 2015