Week 11: Beyond
After watching the videos in the Beyond section, post your cross-curricular outlines below.
Week 11: Gesture
Activities:
1. Record your experiences with Flutter below.
2. Present any examples or links that you have seen, or ideas you may have generated, where education could benefit from this quickly emerging technology.
1. My initial experience with Flutter was a positive one. The software is easy to download and really user friendly. However, when I used it with my music library in iTunes, I realized that it does take a while to learn all the commands. Furthermore, you have to frequently repeat the gestures, as the camera doesn’t always pick them up (this could be a result of camera angle). Another limitation is that you need to be within 1 to 6 feet of your computer, and I am beginning to wonder whether this will truly take off given that constraint. There seem to be a few wrinkles that need to be ironed out, and I guess this is why the NMC Horizon report has suggested it will be a few years before its successful adoption.
2. Upon my initial experimentation with this technology, I could see it being used in education in a myriad of ways. Apart from simply using gestures to control your PC/Mac, I think the gesture recognition aspect of the software could prove very helpful in education. It would be great for teaching sign language, where it could perhaps help determine whether students are using the correct movements. I also think it could have a place in courses such as Drama and Physical Education, in which body movement represents a huge portion of the curriculum. Although I don’t have specifics right now, I look forward to seeing what our cohort comes up with and adding more concrete examples throughout our discussion this week.
I totally agree with everything you said, Manny, but what instantly came to mind for me was the idea of collaboration in a classroom with a large shared display. Students would be able to contribute to class learning more actively, and everyone could see the contributions as they’re recognized and displayed. Students wouldn’t have to come up to the IWB to write.
While helping specific individuals or groups, the teacher could show the entire class something worth mentioning from their present location instead of from the front of the class.
The possibilities are really endless, and it’s innovative technology like this that makes me wish I were still in a classroom.
Hi Suhayl,
Like you and Manny, I too believe gesture technology can be used for a variety of subjects and collaboration among students. Building on those ideas though, I also think it could be a great way to make a classroom that much more inclusive wherein students with physical or learning disabilities can use it to participate alongside their classmates.
I myself did not have such great luck with this program, however that is beside the point. Like many of you, I envision this technology helping many students, either at home or at school, learn various activities. The original idea presented was sign language; now imagine a coaching program embedded with gesture technology for dance classes, hockey practices, etc. Yes, I realize that major sports utilize similar technologies, but wouldn’t gesture technology offer an affordable option for schools and potentially the home?
As well, imagine extrapolating this technology into something like a pencil. I could picture a new Pencil 2.0 (a combination of gesture technology and perhaps a light-signal mechanism) being a great resource for young students learning how to print, and even for older students working on handwriting.
In a traditional learning environment, some thinking would have to go into the physical layout of the class: sight lines to students and the proximity between students (I would not want someone gesturing and accidentally hitting a fellow classmate).
Lastly, I would be interested in how this technology could be used on a regular basis to combat health problems, such as excess weight and the issues associated with that.
Thoughts?
Hi Manny:
Great points about Flutter and gesture. I wonder, though, whether the 1-6 foot distance you mention as a limitation of Flutter is really a problem for an app on a mobile device. How often would you be more than 6 feet from your iPhone? I can see six feet as a limitation for larger applications, such as the examples given by the bigger players.
Doug.
I think the 1-6 foot limitation could possibly be an issue for people who want to sit on their couch and use the computer to watch movies, etc. I have a friend who uses his TV as the screen and sits on the couch with a wireless keyboard and mouse, and Flutter seems to be the next evolution in that direction.
Even before trying Flutter, I am so impressed by the potential of this technology that I would like to share my thoughts here.
I believe that gesture, along with voice, has the greatest potential for enriching the learner’s experience as far as NUIs are concerned. I consider gesture-based computing a form of “extended reality” in that the learner is able to interact with a computer-based environment in an almost tactile way. The only thing between the user and the environment is the display screen, yet it is “transparent” to the experience, and for all practical purposes the user is part of the environment in the computer. A learner is in fact in the matrix 😉 and, just like Neo in the hit movie series, is now empowered to do things that are not otherwise possible, which has limitless potential as far as education is concerned. The keystone of this technology is the new level of interactivity it brings to the teaching/learning experience. It can enable one to passively explore a CG environment by walking through and navigating its landscapes and architecture. This could be anything from an ancient world to an alien civilization, a journey through the body of an organism, or some man-made technology. Like Neo, the power of this technology is that one can not only see the world but also affect it, in such a way that the world in turn is able to affect them in a profound way. In other words, their interaction with the environment provides learning opportunities that are not possible by any other means.
BTW why is it that game developers are some of the first ones to exploit this technology?
Flutter was a lot of fun! Totally enjoyed the experience. I’ve been eagerly looking around for people to show this to. The only problem with this great program is that it has just a few gestures. Doesn’t it give you a craving for more gestures?
To answer/add on to Patason’s response: I think game developers are putting a lot of research and development into this technology because there is a financial reward in it. Their only requirement is that it has to be entertaining. This is much different from the education sector, which is responsible for making the technology both engaging and educational. Two hurdles, compared to the video game industry’s one.
I remember a while back when a Street Fighter/Mortal Kombat type of video game first had this technology, allowing players to kick and punch while the character responded to their physical actions.
Now we are into the later stages where the Kinect is doing this in homes.
One of Kinect’s obvious implications is the promise of teaching dance in a natural way. Of course, you are relying on the camera as the input device to judge when you are dancing successfully or unsuccessfully, which may not make it the greatest judge, but it still has an opportunity to flourish.
In the classroom, I see the physical nature being great for students who need to move and interact in order to learn. It is amazing to think of a great big display that students can gesture at and interact with for information.
Yes, Jonathan, I agree that game developers are prepared to invest in their games because of the financial rewards they can bring. However, don’t you think that the kid who is willing to plunk down ????$ for a video game will be willing to do the same for a lesson, unit or module that leverages video game technology? As an example, I remember downloading and playing WolfQuest (http://www.wolfquest.org/) for another MET course. In this game you are a wolf who has to survive, find a mate and raise a family in a semi-realistic environment, and it essentially teaches ecology from a gaming standpoint. All of my children (3) and my nieces and nephews were hooked on the game before that summer was out. They saw it in the same light as any other video game, and I am sure they would prefer learning about ecology this way rather than through the traditional textbook mode.
That being said, I agree with you that the video game industry is different from education, but I believe the difference is that education is not seen as a for-profit endeavor by educators. The textbook industry is the biggest exception to this. In addition, in my opinion, we are still grappling with how to leverage new affordances such as gesture in our ‘game’, while gamers are light years ahead of us in this respect. In essence, we educators have to start to see education as more than just a prerequisite for the members of a society, but as a money-making enterprise in its own right.
For those people with an interest in programming, I just discovered a library called OpenCV that works with Processing (and most major languages and platforms) and allows you to use input from your camera in your programs. It focuses on what is called ‘real-time image processing’, which uses live input from a camera to do things like compare one frame to the next. So, for example, if it sees the same background (no one there) for a long time, and then suddenly sees a big change in the colour of the pixels (because someone walks into the frame), this could be used to trigger an event, like music playing, a message being put on a screen, or power being turned on for some other device. Hopefully I will get a chance to play around with it this week, but it looks promising as yet another option for some DIY gesture fun.
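To make that concrete, here is a minimal sketch of the frame-differencing idea described above, written with OpenCV’s Python bindings (cv2) rather than the Processing wrapper mentioned; the threshold value and the 5% trigger are arbitrary choices for illustration, not anything prescribed by OpenCV or Flutter:

```python
import cv2

# Compare each new webcam frame to the previous one and fire a "motion" event
# when a large portion of the pixels change (e.g. someone walks into view).
cap = cv2.VideoCapture(0)                       # default webcam
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)         # pixel-wise change from last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 0.05 * mask.size:   # more than 5% of the frame changed
        print("Motion detected - trigger music, a message, another device, etc.")
    prev_gray = gray
    cv2.imshow("change mask", mask)
    if cv2.waitKey(30) & 0xFF == ord('q'):      # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```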
Using Flutter was interesting, but as has already been mentioned, it takes a while to learn all of the functions. I have used Kinect before, and unfortunately the gesture commands were different. In Kinect the ‘go left’ function is completed with a swipe, not a thumb pointing in that direction as in Flutter.
If gesture software were to become mainstream I think it would be important that some standards are followed, similar to how ‘x’ always means exit in a window.
Like most touch/gesture technology I think there is a huge opportunity to leverage this capability for visually impaired or otherwise challenged students. To use on a large scale in a regular classroom however, one would have to consider design and the value that is added. I think gesture based technology will make certain tasks easier but not necessarily better for learning on a large scale. So once again it comes back to cost.
I think there’s no question that it will become mainstream, and costs are already coming down. As you saw with Flutter, it can already be done with a single webcam, although that uses a fairly rudimentary system called ‘blob tracking’, which is based on the computer recognizing ‘blobs’ (blocks of darker pixels on the screen) and then comparing one frame with the next to see if the blob moved. To do more advanced tracking of movements, you just need two cameras. The cameras on our laptops, when purchased in bulk, cost about two dollars (can you believe it?), so it won’t be a big deal for manufacturers to install two of them on a laptop. Look out for the upcoming Microsoft Xbox Surface gaming tablet that will certainly set a new trend with gesture-based input. (We will probably see a proliferation of gaming tablets for a while, so companies can try to make money on two fronts, but then sooner or later there will be a convergence and all tablets will be equipped with this sort of stuff.)
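For anyone curious, here is a rough sketch of what single-camera blob tracking of that kind can look like, again with OpenCV’s Python bindings; this is a toy illustration of the general idea, not Flutter’s actual algorithm, and the darkness threshold and the 40-pixel jump are invented for the example (it also assumes OpenCV 4’s findContours signature):

```python
import cv2

# Track the largest dark "blob" in the webcam image and report when its centre
# jumps sideways between frames - a crude swipe detector.
cap = cv2.VideoCapture(0)
prev_centre = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dark pixels become white in the mask (THRESH_BINARY_INV).
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        blob = max(contours, key=cv2.contourArea)          # biggest dark region
        m = cv2.moments(blob)
        if m["m00"] > 0:
            centre = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
            if prev_centre is not None:
                dx = centre[0] - prev_centre[0]
                if abs(dx) > 40:                           # blob jumped sideways
                    print(f"blob moved {dx:+d}px horizontally - could map to a swipe")
            prev_centre = centre
    cv2.imshow("blobs", mask)
    if cv2.waitKey(30) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```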
In the meantime, you can pre-order the Leap, which has similar functionality to the Kinect but is supposedly 200 times more sensitive, works on multiple platforms and will only cost $70 USD. It comes out in a couple of months.
I could not use Flutter because it doesn’t work with AMD processors.
I also found this to be a very interesting program but struggled with it initially. Like any new technological feature, it took me some time to learn how to use it properly. I actually could not even get it to work on my computer at first, and I began to get very frustrated because it looked so easy in the videos.
I think this type of tool could create many opportunities in an educational setting. The very first thing that came to my mind was special needs students who do not have the manual dexterity to work with keyboards and touch screens. This type of tool can open up so many doors for individuals with such disabilities, allowing them to become more active in their learning. I also think this type of tool is great for group work and collaboration.
One concern I had is that gestures have different meanings in different cultures. A technology like this may therefore need different gestures in different regions and may not be applicable cross-culturally. For example, some cultures may find certain gestures rude or inappropriate, or may use a gesture for a different purpose in everyday life and find it difficult to adapt to the one used in software such as this, resulting in gestures having to be changed for different societies.
Nureen
Late to the discussion party, and not much new I have to offer but agree that gesture has great potential in computing – for special needs students, phys ed, etc. Flutter was a neat little experiment that gave a simple taste of the great potential.
As others mentioned, it takes a few minutes to get used to using Flutter. I have not yet had a chance to try it with QuickTime or Keynote. I think those might be areas that would be great for everyday use and for presentations.
One area where I think this type of technology could really have an impact is coaching. It would be great to have a player take a slap shot, pitch a curve ball, or shoot a 3-pointer with a system monitoring their movements. It could then be used to analyze their form and offer suggestions for maximum efficiency and correct technique. I could see golfers being among the first to really adopt this, since there are already so many golf simulation games and places to “drive” a ball indoors. As a coach, I would love to be able to use this technology.
One area where gesture-based computing could provide educational benefits is in the realm of virtual field trips. Some virtual field trips such as those set up with a museum are designed to feel like walking through a museum. It would help with the immersive feeling for students to be able to use their body to navigate around the museum like they would if they were actually there. Using an app such as LookBackMaps could allow students to intuitively navigate around Google maps and locate historical sites and research information.
Jhodi
I tried out Flutter and really enjoyed it. Not being a gamer myself, I think I found its novelty more exciting than anything else. I also found it took a while to learn the gestures, and I don’t really find it any easier to use than clicking the mouse. The fact that you need to be within a short distance of the computer is a limitation when using Flutter with iTunes. Thinking of the scenario of a party, I personally would find a remote easier to use for switching songs than gestures. I do like the affordances it provides for people with unique needs, such as the woman who had the stroke. In terms of use in an educational environment, I think there is great possibility here. I have never liked that the Smartboard allows only one child/person at a time to move or change the screen. Gesturing might improve this technology to engage more learning in action.
I also couldn’t get it to work with my at-home systems (and I have no music or videos on my computer). Alas! However, I get the gist of it. I like the point about it being useful for teachers, who could access the board from any point in the room. Also, allowing for multiple contributions would open up some great opportunities. What if students could use it to collaboratively build models from different eras? Like a really educational/interactive version of Age of Empires. I can also see this eliminating a lot of technology (which otherwise goes to the landfill), so, if I’m correct, this could move us towards less “stuff”? Is there any information in this field about environmental impacts?
I downloaded Flutter and am currently using it with my iTunes library. I am having no difficulty with the hand gestures; you just have to ensure your hand is being captured by the camera and is not moving too fast.
The videos showing how gesture applications can be used for communication with stroke victims, individuals with syndromes and the elderly are brilliant, and I anticipate that this area will continue to advance in the future.
Thanks for the learning.
Catherine
Hey All,
I wasn’t able to use Flutter, but I have used Kinect to play games such as bowling and target shooting practice. Gesture technology is truly amazing. I can see the benefits of this technology in an entertainment aspect (such as gaming or turning on your TV and browsing through the channels), to helping people with disabilities. On the educational front, not only do I see the benefit of being able to project or move things around without having to physically use a computer or mouse, but also the ability gesture technology has for teachers to create interactive assignments for students. I think it would be relevant in science classes where students would be able to use gesture-based computing to do experiments or dissections. Students would be able to do the actual procedures of cutting into or opening up a specimen (if having a real specimen was not possible). They could also rotate the object around to see the specimen from all angles. I could also see this technology being used in a physics class.
This technology allows students to be actively involved in their learning instead of passively receiving information. It lends itself to student-centered activities, taking the focus off the teacher and onto what is being taught and learned. Students can explore, simulate situations, find answers, and collaborate with others. But as always, if this technology is introduced into a classroom, teachers must make a shift in how they teach to lead their students to this type of active learning.
Lisa
I do not have access to a computer that I can try out Flutter on, but I did want to mention some advantages I have come across with gesture technology. In both my elementary Life Skills classroom and my Profound Mentally Handicapped (PMH) High School classroom, we often used the Nintendo Wii system for adaptive Physical Education. Regardless of mental ability to understand the game or physical capability (most PMH students needed hand-over-hand assistance), all students were exposed to the idea of cause and effect: their own physical actions were making something “fun” happen on the screen. As I discussed in the touch screen forum, it was hard to determine whether the PMH students actually made this connection, but the Life Skills students certainly did and had a ball. (It also helped that we created Mii avatars for each one of them. They were so excited to see cartoons of themselves and took such pride in making themselves do things on the screen.) Since a big part of our Life Skills curriculum was working on gross motor skills, Wii “play” fit in perfectly. Now, with advancements like the Kinect system making remotes obsolete, this technology will only continue to enhance students’ experiences and access to interactive activities.
Thanks everyone for contributing your experiences with Flutter and your ideas about how gesture-based technology can be used in education. Here is a summary of your observations and comments to date. Feel free to keep commenting until the end of day Sunday.
In summary, Flutter was a great introductory experience with gesture technology. According to the gesture poll, five of you have at least one gesture-based app on a device and 14 have none. In general, your observations were positive: the program worked and has potential. Some of the problems were the limited set of gestures it recognizes, the time it takes to learn the functions, and gestures sometimes not being recognized. Also, several people mentioned that they did not have the technology to support the app; availability is always an issue with newer technology.
Here is a list of suggestions you provided for where gesture could be used in education: sign language, drama, physical education, science (biology), dance, coaching and sports (hockey, golf, etc.), health education, ecology, computer programming, and printing and writing. I am sure I missed a few. Several of you mentioned how gesture could improve education by increasing collaboration and interaction, enabling virtual field trips, and improving the interactivity of SMART Boards.
Many of you saw a real need to use gesture-based computing to help students, adults and the elderly with disabilities (mental, physical, visual) or syndromes by allowing them to be included in classroom and everyday activities.
For me, I learned tons about gesture, voice and touch while working on this module with my mentors Ben, Stuart, Frank and Joel. It was almost embarrassing how little I knew about these technologies when I started this project. Now, as after going through the other modules in this course, I am much more knowledgeable and a little wiser.
Doug.
I was unable to use Flutter due to (once again) my OS being out of date and still not having an update. However, after reading more about it, I was completely sold on this technology. I believe it would be highly useful, as many have mentioned, for those with special needs and also for the elderly. I started to envision what would happen if you combined gesture technology with virtual reality. I believe York University has a lab where you can gesture while also being immersed in a virtual reality, though I do not have details on this. I am sure gesture will become part of the norm and will eventually replace many input devices (such as the mouse).
One other idea I had, relating gesture to my field, filmmaking, is that it would be great to incorporate gesture recognition into cameras. For example, when filming a documentary, the subject most often moves in ways you cannot predict. Of course there are already things like auto focus and auto exposure, but perhaps cameras will be able to respond to the director’s hand while filming is taking place. What is often done is that the director and camera operator have a code (similar to baseball): 1 means close-up, 2 means medium shot and 3 means wide shot. They have a silent, secret code because they do not want to interrupt the flow of the interview. If the director could gesture this with their hand, and a device near the director’s hand could read it, the result would be a more streamlined way of working. Of course, it will also lead to machine errors, but it should be interesting to see where this takes us.
Here’s a cool new thing. This company has found a way to use acoustic sensors to recognize the different sound signatures of different touches, so your cell phone would know the difference between you sliding a finger or sliding a knuckle across the screen. It could also tell the difference between different materials, so you could have a stylus with a ‘pen’ end and an ‘eraser’ end.
http://www.engadget.com/2012/11/18/qeexos-fingersense-lets-touchscreens-listen/
Like others mentioned, I also find the best application of gesture is for students with disabilities. I like Flutter and think it’s great on a PC/laptop for finding and/or closing applications, increasing/decreasing volume, iterating through music tracks, etc.
I echo the concerns of others about the cultural and usability significance of gestures. I agree that there is a need to come up with a set of standard gestures and their meanings. This will make applications much easier to use, just as web designers adhering to de facto standards does now, for example using blue for hyperlinks, consistent search box placement, and home buttons/links on websites.
Week 11: Touch
After reading the Touch page and watching the videos, please answer the following 3 questions:
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
1. Professionally, no. As a school just beginning to explore BYOD technology, and before that having had little to no technology inside the classroom, this reality has never existed for us. Personally, I have watched my now 4- and 6-year-olds navigate extremely well through my iPad to play various games and participate in multiple learning activities. (On a side note, I do not believe my children are naturally gifted at technology, as many people assume; what I do believe is that the people who invented the interface have made it so natural and easy to figure out that age is now almost a non-factor.) Lastly, due to my location in the world, I have not experienced geographical or socio-economic improvements due to touch technology. I have heard of the extremely inexpensive tablet computers, and I am interested; however, I am concerned about the longevity of such devices, which may lead to more disposable tech… Thoughts?
2/3. For me, these two questions are very similar, and my answer to both relates to the touch-screen desk that appeared in one of the TED Talk videos. I envision desks such as these for each student, containing textbooks, audio files, etc., which would allow the promise of interactive education to be fulfilled for each individual student… Thoughts?
Secondly, even though Apple made Touch technology common for the consumer market and profitable for themselves, it is important to remember the many milestones that have occurred before:
http://www.npr.org/2011/12/23/144185699/timeline-a-history-of-touch-screen-technology
I myself remember some of these… 🙂
Thanks Week 11 for a great start on some interesting topics – I’ve just started experimenting with some of the voice apps but want to jump to Touch technology.
Interesting timeline, Tom. It’s always important to remember that even with the rapid leaps in technology, there are still small steps along the way. I found one key step absent in the timeline you posted: interactive whiteboards (e.g. SMART Boards). While these companies likely didn’t create many new touch technologies, they certainly made huge inroads into education. But it took many years; I remember seeing SMART Board vendors trying to sell their wares many, many years before the boards suddenly became very common in my own school division. Alas, most schools spent thousands on single-touch boards, and now of course multi-touch is the standard.
While I think IWB (Interactive White Boards) have great potential, I’ve seen too much money spent recently on them just because schools see them as a “must have” to demonstrate they have “integrated” technology. Already, many of them are almost obsolete due to multi-touch. And iPads.
My point I guess – by the time many schools are ready to really invest in a technology, it is already being surpassed by something better. For expensive technology – this is a major consideration. Schools need to invest – but is big and expensive the best option if you are late to join the game?
Hey Peggy, thanks for the post.
I am getting the feeling from your post that a school is a large entity that is slow to change and adopt new technology, and that when it does change it has to go for proven technology that may already have been surpassed, due to accountability and costs.
Do you think a school should be broken up into classes, with teachers deciding their own use of technology? This would allow for early adopters and trials of the newest digital resources, thanks to the reduced volume and faster adaptability.
Stuart
Personally, I think that would be a great idea. However, I am unsure a district would allow such approaches, as volume purchasing reduces prices, and consistent technology reduces IT time and subsequent costs.
However, as a classroom teacher, I would fully endorse such an approach.
Thanks for sharing some of your reservations regarding whiteboards, Peggy. Unfortunately, when technology (or anything new for that matter) is introduced in the classroom, it usually undergoes a pilot phase of about 2 years. I am currently part of a pilot project integrating iPads within our school district. However, according to Moore’s Law, by the time it is ready for full implementation it will probably already be obsolete. It is sort of a catch-22, but something we must live with. I guess the alternative is to do nothing at all, and I’m sure we would all agree that that avenue is not an option if we want to remain on the cutting edge of innovation.
Thanks for the extended discussion. For a large division such as mine, where tech support can be an issue, allowing every teacher, or even every school, to go in their own direction, with their own technologies, is a poor option these days, as desirable as it might initially seem for individual schools.
I agree pilot projects are likely the best approach, but as you suggest, Manny and others, by the time the pilot is done the technology has moved on (iPad 1 > 2 > 3). But iPad versions, in the end, are less critical than the iPad itself.
To me the lesson is then …
(1) Make sure the technology is not just a flash in the pan that will be totally obsolete within a year or two; it needs to have some established life span already;
(2) Realize that having the most recent version (e.g. of iPads) is not the critical factor. Tailor instructional technology implementation to the lesser version of the technology, once the technology itself has been determined to be worth the investment.
I’m sure there are more points to add – suggestions? Alterations to points 1 & 2?
I think it would be ideal to let teachers choose which technologies they implement in a classroom. I understand the disadvantages for servicing and purchasing of equipment, but it would allow for a lot more innovation. When teachers have a choice in what technology they are using, they feel empowered and are more likely to use it effectively, instead of just being assigned a SMART Board. Different styles of teaching lend themselves to different kinds of technology. Technology is just a tool that needs to be wielded correctly by the user for a specific purpose.
The other point is that educational technology depends on teachers to evaluate its usefulness. If teachers or other users of educational technologies don’t find it useful, they just don’t use it. The result is that technological advances will focus only on what is successful. This evaluation process gives direction and focus to further research. That is why I think the decision should lie with the teacher and not the school district. I often find that the people buying the technology buy into the marketing pitch instead of what is really needed.
If I may, I think that’s a great question. All too often, we are so stringent about rules and bureaucratic guidelines and processes that by the time we get what we want, or a decision, it’s too late and the technology is obsolete. I have experienced this first hand with many forms of technology, most recently with the iPad: by the time the decision was made to go ahead and allow the purchase of the iPad 2, the iPad 3 was already out. But there were lessons learned for our organization. We are less strict about what can be purchased. We now try to focus on how the requested technology will meet the school development plan (SDP), whether it’s hardware or software. If there is a need for a specific form of technology and it meets the SDP, then a simple Privacy Impact Assessment form is filled out and it is approved rather quickly. I think this will allow for more time with current technology instead of us being technological laggards.
Thoughts?
That’s great, Patel, so you are saying that schools can act independently according to their own SDP?
Does this mean the district has no control over what the school spends as long as they are in their SDP?
Hi Tom, thanks for sharing.
Though Apple has become the corporate faceplate of touch in recent times, you are certainly right to point out that its history goes back much deeper; in that regard, Apple has just been a good entrepreneur in bringing this technology to the mainstream market.
Perhaps a contemporary version of the Electronic Sackbut that first incorporated touch might be the Hydraulophone – a touch operated acoustic instrument that operates hydraulically.
https://www.youtube.com/watch?v=tgU0OZkGhGI
I agree with you that with cheaper versions, longevity will likely be an issue, as with any replica-type manufactured product. But I think if we take a long view on this, then through iteration those cheaper tablet manufacturers will also learn how to improve the quality of their products, just as anyone else would. The goal of creating a $20 tablet is a noble one, but no one said it would be an easy one to achieve, and it is certainly not a static one across time.
As for the future of touch, I wanted to focus on the fundamentals of what I see as its most important attributes. But take a look at this video of MS Labs’ Vision for 2019, in which touch helps us improve cross-cultural communication, learn visually in ways previously not possible, manage and navigate our lives in smarter, simpler and more convenient ways, and even shift our perspectives across space and time. The future of touch is ripe with opportunity; where do you think these opportunities lie for the field of learning and education?
First off, thank you for sharing this video; it shows what I currently believe, namely that touch technology is most likely the next evolution of the interface with technology, much like the mouse was before it. As you said, it allows us to potentially navigate our lives in smarter, simpler ways. But, as the video you shared suggests, touch technology is only as powerful as the accompanying gestures, and this is where I myself have some difficulty imagining the future.
You mention that touch technology (which must include gestures to be utilized) will improve cross-cultural communication. But whose culture are we developing the touch/gesture combos on? Will all gestures be Western in origin? Take this website for example:
http://westsidetoastmasters.com/resources/book_of_body_language/chap5.html
It is vital that we consider that the entire world does not view our gestures the same way we do, nor is it our right to force our Western gestures on the rest of the world. This is something I know you are not saying or implying, but it is something we sometimes forget, as was evidenced in the One Laptop Per Child program, where heavy Western influence and assumptions led to issues.
Lastly, I will focus on the point you made about using touch to learn visually. While at first glance this may seem contradictory, upon reflection I was reminded of the Minority Report scene:
And yes, like you, I believe this technology, with consideration and careful thought could enhance areas of education. I wonder though… Even if it may help, could anyone in the near future afford classrooms full of touch desks?
Thoughts?
I am not sure who the ‘we’ is that is forcing anything on anyone. As noted above, there are a few big names in the game such as Apple and Google, but this is not the 1990s. There is hardware being developed in India by Indian companies and hardware being developed in China by Chinese companies. If you are willing to spend a weekend learning how to do it, you can produce your own interactive hardware in your garage for a few dollars. The tools are easily available for anyone to learn to write their own software that leverages these technologies. The process has been democratised in a big way. I can write software to share with just my own classes, with no intention of ever sharing it with the outside world, and I can do this without even having to write a single line of code. And the tools I can use to do this are free.
I agree that the world of technology is opening up, and that the hardware development you describe is becoming, and will continue to become, a more global venture. I also agree that it is not the 1990s.
My concern is fundamentally with the operating systems the hardware runs, and more specifically with the large companies you mentioned, like Apple, Google and even Microsoft: what are they basing their touch/gesture technology on? Are they going to create an industry standard like the WHMIS system, which is the same around the world regardless of language or cultural norms? If they do, which touch gestures will they base it on? In my post above I showed how even common gestures carry different meanings around the world. My guess is that the default would be Western, as these are primarily Western companies entrenched in Western values and norms. I myself have an issue with that.
Or it could be approached as you mentioned, with various individuals and companies creating what they want, how they want. The benefit here would be the customization of products for local areas; the downside is the cost of mass-producing this. And yes, I am not talking about the hobbyist in his garage; I am talking about the large multinational corporations that like to earn profits, for it is these companies that, at least for the near future, will drive the majority of this industry.
Regardless of this concern, I do see an overall educational benefit of this technology, especially for the young and elderly who have either developing or deteriorating manual dexterity.
Thoughts?
I understand your concerns, and I think there’s no question that Western sentiments dominate the industry, but I still think you may be underestimating the ease with which companies can and will be able to customize the experience for their own users. With Android, Google provides a core which is itself based on a lot of other people’s technologies (the kernel is Linux-based, for example), but the user interface is highly customizable, as evidenced by the many variations we see with Sense on HTC phones (a Taiwanese company), Samsung’s TouchWiz, and more. Here is a list of 42 launchers that have been created to replace the stock Android one, each replete with its own library of gestures:
http://en.wikipedia.org/wiki/List_of_Android_Launchers
A couple of years ago, there was a group of Chinese hackers (white hat) that had been mucking around customizing their own Android rom (custom firmware) for their own purposes. They borrowed ideas from a bunch of different phones and added a lot of their own stuff. They made this custom ROM available for download, so other people could replace the operating system on their phone with this modified Android one. It uses a lot of unique gestures to control it and has a lot of unique characteristics. It became very popular in the Android community and since they made their code open source, a lot of other people started customizing it for their own market/language as well. Their ROM is called MIUI. Their company is called Xiaomi. They managed to get some backers and scraped up enough money to get started making their own phone. A year later, they had sold half a million phones. Then they made a deal with China Unicom and sold another million phones.
If you picked up one of these phones, you might not recognize it as Android at all. They replaced the existing voice recognition software on it with one made by a Chinese company that works better with the Chinese languages. They included software so that you can draw Chinese characters with your finger to launch applications. Not bad for a few guys that started in the garage.
The programming involved in replacing one gesture with a different one is not a whole lot more difficult than remapping a combination of keys on a keyboard.
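As a purely hypothetical sketch of that point (the gesture and action names below are invented for illustration, not taken from any real launcher or from Flutter), swapping a gesture binding can be as simple as editing a lookup table:

```python
# Hypothetical gesture-to-action bindings; remapping a gesture is just an
# edit to this table, much like rebinding a keyboard shortcut.
gesture_bindings = {
    "swipe_left":  "previous_track",
    "swipe_right": "next_track",
    "palm_open":   "play_pause",
}

def handle_gesture(gesture_name, actions):
    """Run the action bound to a detected gesture, if one exists."""
    action = gesture_bindings.get(gesture_name)
    if action is not None:
        actions[action]()

# Localising the interface could then mean swapping which gesture maps to
# which action, without touching the rest of the program:
gesture_bindings["thumb_right"] = gesture_bindings.pop("swipe_right")
```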
Thanks for the link about body language. I agree that not everyone uses the same gestures, and I also feel there is a certain Westernization of education. For example, though I found Sugata Mitra’s ‘Hole in the Wall’ experiment interesting, I did not approve of his response that the programs did not need to be translated into Hindi because children learned how to use them intuitively. Since the end of the British colonies, English has imposed itself as ‘the’ language to communicate in, and myths emerged about learning in an ‘English only’ environment, preferably with ‘native speakers’; all imperialistic notions. Unfortunately, this is now the case with technology and the WWW: most of it is in English. So though these technologies afford many things when it comes to education, I also think we should question their effect on minorities, cultures and other language systems.
Oops.
Here’s the Microsoft Office Labs vision 2019:
https://www.youtube.com/watch?v=8Ff7SzP4gfg
1) In my experience, I have to look no further than my 3-year-old daughter and her interaction with touch technologies to see their overwhelming impact. An important observation I made is similar to Sugata Mitra’s observation of people living in remote areas. If you observe an infant using touch technology, an interesting phenomenon occurs: they begin to understand the cause-and-effect relationship and, similar to the boy experimenting with “the hole in the wall,” they realize that their hand gestures can control what happens on the screen. What became fascinating to me is how this technology aided my child’s verbal communication. Of course, as parents we model the language for them, but coupling it with technology takes it to another level of comprehension. An important takeaway is that touch technologies don’t discriminate in users’ ability to use them.
2) The beauty of NUI technology, compared to its GUI and CLI counterparts, is the ease with which it can be learned. For this style of touch technology, the learning curve is quick, as was demonstrated in the Sugata Mitra video. In essence, this provides an even playing field, as students can concentrate on the content and not the procedure of operating a computer.
3) The primary way for districts to facilitate the innovations provided through touch technologies is simply to embrace the hardware that affords it. Tablets are making their way into mainstream education and the hardware seems to be popping up more frequently. When implementing new innovations such as this one, cost usually becomes one of the main counter arguments/concerns. I believe that this is where the BYOD initiative comes into play and helps offset some of the fiscal concerns around integrating touch technologies.
Like others who have posted, I have witnessed touch screens being used by small children, who seem to quickly learn and master how to navigate various platforms because they are so intuitive. I like the point made in the OER that we have been making learners adapt to fit the technology instead of having technology fit the learners. I would hope that the future of education includes more intuitive technology with gentle learning curves, providing ease of use to everyone regardless of their experience or background. Will we ever get there? I believe so. In my lifetime? Maybe in higher education and organizations, but in the public education system and remote areas of the world it will be very difficult. The MET video mentioned that it would be ideal to use technology in less fortunate areas before well-developed ones. Of course there are many issues with this, the big ones being funding and access.
Haven’t we always had to adapt to technology? From pencils to spears, there has only been so much that we could do to customise our user experience. I would venture that the purpose of education is a combination of teaching students both how to adapt technology to suit their needs and to adapt themselves to suit the technology available.
1. I think one of the huge advantages to touch is the improvement of learning and communication opportunities of mentally and/or physically challenged children and adults. I thought this article: http://articles.latimes.com/2012/sep/05/news/la-heb-ipod-touch-autism-20120904 provided a good example of how an iPod touch really helped an adult with Down syndrome, who could not read, tell time or understand a calendar, keep her job. Another article showed how a young autistic student is now able to write for the first time using iPad’s touch-screen and how a school is using iPod Touch for each of their autistic students. http://www.nj.com/news/index.ssf/2011/01/apple_ipad_itouch_may_help_peo.html
2. Children are not as assimilated as adults and therefore are better explorers. Touch would (and already does) lead to more learning opportunities for even younger children. As many of you have probably experienced (and some have already mentioned), even two year olds can become pretty good at using touch pads; it was my friend’s 2 year old daughter who showed me how to change the views on the Magic Piano app.
3. I think touch technology will greatly impact the types of tools purchased by schools – more iPads/tablets and less PCs. Schools need to stay ahead of the technologies (taking this course, reading the Horizon reports and being part of Ed tech communities) so that they can lead and facilitate the innovations that touch can provide by hopefully getting the tools in place while they are still current.
I think Jenny has touched on, pardon the obvious pun, what is undoubtedly one of the truly important benefits of using touch and gesture based NUIs: their affordances for mentally and physically challenged learners.
To highlight this point, just last night on CBC Radio’s As It Happens, they interviewed two teachers from the Toronto area who worked with a researcher from OISE/UT to investigate the effectiveness of iPads as a communication and instructional aid for students with autism. The gesture-based NUI of the iPad proved to be an effective method of ‘reaching’ the students, which allowed for more sustained opportunities for academic and social instruction in the classroom. The efforts of the two teachers earned them both Prime Minister’s Awards for Teaching Excellence. To learn more about this research, you may want to listen to the complete interview on CBC’s podcast found here: http://podcast.cbc.ca/mp3/podcasts/asithappens_20121113_25689.mp3
I believe many of us feel we have lived long enough with input technologies from centuries past. Heck, the clumsy QWERTY keyboard layout we all use today, was invented in the 1800’s with the goal of avoiding jamming manual typewriters when adjacent letters were struck! Touch technology in general is a much more intuitive way to interface with digital devices and when combined with the emerging technology of haptic* feedback, touch based interfaces will continue to revolutionize how we interact with technology.
Personally, I can’t wait for a touch screen iMac, which by all accounts is already sitting on a desk in Sir ‘Jony’ Ive’s design lab in Cupertino!
* For more information on the emergence of haptic technology in education, check out these links:
http://news.vanderbilt.edu/2012/03/haptic-tablet/
http://www.bristol.ac.uk/news/2012/8821.html
http://tinyurl.com/a8nf768
As we were beginning to research this area, accessibility stood out for us too. We thought it was so important that, for a while, we were considering focusing the entire week’s discussion on it. At last year’s Hong Kong Electronics Expo, I met a Canadian guy from Calgary who runs a company entirely dedicated to creating alternative input systems for people with disabilities and special needs. He had devices that could go in your mouth so you could control your computer using your tongue and by blowing air through a tube. He had eye-tracking technologies. All sorts of cool stuff. The prices were through the roof though. No individual could afford these things on their own; they would need support from a school board or some government institution. I think a lot of the more recent developments we have posted about here show that there is a convergence happening, where many technologies previously targeted at the disabled are also able to help all sorts of people be more productive.
That’s a very valid point. Oftentimes technology that is useful for one type of learner is useful for all learners, in the same way that a teaching strategy you implement for one student might be applicable to all.
1) Absolutely, the technology has improved. Has access to these devices improved as well? I think you only need to look at how the prices for these devices have gone down and how many people are considering them as a replacement device at home. People are quickly finding that they can’t be a full replacement but can do many of the same things. While I haven’t personally observed these devices transcend socio-economic barriers, the mere fact that they can be purchased for under $200 makes me think that they do. The lower-end tablets and devices are also making their way into Third World countries, as Melissa mentioned in a post about $20 tablets and the group’s references to $35 ones.
2) I think this innovation has led to, and will continue to drive, ways for students to engage with material more intuitively. On one hand the student is looking at a computer to access information, but the interactivity and one-on-one intimacy a child can have allows mistakes to occur more fluidly, letting the child continue practicing and learning a concept. For children with different learning needs, the ability of touch software to interpret their writing strokes can be powerful.
3) Manny nailed it with his reference to Moore’s Law: it is difficult for schools to lead because they want to be certain about the technology they place in classrooms, given the enormous costs involved. However, the most positive change we can bring as educators is to embrace what we are given and use it to its fullest. The applications can often become overwhelming and cost-prohibitive. Finding alternative solutions (and developers often know this) and finding a way to bring these skills to our children is ultimately what is most important. Schools, and teachers, need to be educated about what these devices can bring to the classroom.
In reflecting on my own experience, I have seen how the element of touch has greatly improved technology and education across age groups via the Smart Board. In considering why this is the case, I believe one of the most significant reasons, among others, is that it inherently offers some form of interactivity, which captures the learner’s interest and encourages them to engage with the device and the learning material itself.
In regards to geographic and socioeconomic barriers, I cannot speak much to the improvement from a personal standpoint, but I can say that after watching the TED Talk with Sugata Mitra, I was very impressed with the results of his experiments with young learners in India, and I believe they serve as a key example of how touch technology can be similarly effective for children across a range of socioeconomic and geographic locations if they are given access. As such, I think that embedding touch technology within education can lead to more valuable learning and higher academic achievement among learners, and so, while it may come at a considerable cost to the school and would require teacher dedication, it would be worth it.
Thanks for your thoughts.
Some things I’m reading/hearing:
Touch’s intuitive and direct interface may empower individuals with autism and other learning disabilities to become more fully participating members of society.
– Agreed. In the same manner that touch enables 2-year-olds to use a computer like never before, touch technology and haptics research are developing numerous ways to help individuals with disabilities and special needs live fuller lives. So far, much of the evidence of the success of these efforts is anecdotal, and it is easy for the non-scientist, such as myself, to downplay the heterogeneity and complexity involved in creating effective technological solutions to such long-standing problems. However, I do believe that NUI is a game changer and that its convergence with voice and gesture will significantly alter what is possible for the disabled and those with special needs in our lifetime. Agree/Disagree?
We can infer from Moore’s Law that the rate of technological change will outpace the rate by which schools can evolve and adapt to useful technologies in a timely manner.
Some solutions proposed:
1) try to keep up by staying informed and educating educators on developments in this area – i.e. dedicate more time and resources to education technology (oh.. I see where you are going with this 😉
2) you can’t keep up with it all, so focus your efforts around your School’s Development Plan (SDP), and what you need to adopt to achieve its goals.
3) promote more project-based initiatives and empower teachers to carry them out as a way of experimenting with what works before investing heavily in it; some have pointed out that such a pilot mechanism already exists, but it does not help address Moore’s Law.
4) the issue of technology cost is prohibitive. How do we increase access to technology without having more funding to facilitate it?
– These are all good analytical points in their own right. And I don’t mean to pull a Tyler Durden on you guys, but why don’t you all just come out and say it?
We are witnessing the end of the Industrial Revolution, and the systems we have built for this era, be it our education system or other hierarchical models of business and governance, are grossly inadequate to tackle the challenges we face in the future digital economy. It is an economy whose success will not be driven by compliance, or command-and-control as in the industrial model, but by letting go of control to make way for co-creation and social intelligence mechanisms, to which perhaps this course can serve as a contemporary reference.
Fact is, in the information age, no matter how hard you try, you will always be too late.
And that fact is not inconsequential.
According to McKinsey & Company’s research paper entitled “An Inconvenient truth about change management: Why it isn’t working and what to do about it,” 70% of organizational change initiatives fail.
They fail because in the direction we are heading, wisdom lies in crowds, and not in the heads of a handful of executives and their SDP.
They fail because SDPs do not embody a program or an ecosystem that is needed to sustain change in the long-haul.
Finally, they fail because in the 21st century, if your focus is on a project and not building a community of knowledge needed to leverage learning experiences (the NMC or our course for example), then you have failed to leverage and access the knowledge you possess as an organizational whole.
As for cost, that too was a big impediment in the industrial era. But as Ben has pointed out, in the digital age the means of production are available to anyone – and concurrently, their costs are relatively minimal.
For example, Sugata Mitra did not need millions to show that those who were poor and in remote locations could learn through technological access. And the Khan Academy started with just Khan sitting at his PC, recording video tutorials for his cousins on YouTube; now these videos reach millions because the net makes them scalable.
If we look into the future, the issue is not a crisis of cost or the way to achieve our goals and objectives. At stake is a crisis of vision in modern education; one that has yet to fully realize that the technology we’re building is completely redefining the challenges we face and what is possible.
The Nobel Laureate Paul Samuelson once remarked, “When events change, I change my mind. What do you do?” In the 21st century, technologies such as touch, voice and gesture are changing the world in which we live. We can no longer afford to let our romantic attachments to the educational pedagogies we grew up with determine the way forward. If it is to be pertinent for the future, our education model must reflect the new realities of the digital economy. And I would argue that to achieve this, our education needs a new raison d’être. Perhaps that is why we are here.
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
A few years back I had a student in my class who could not speak or hear and was in a wheelchair because she could not move most of her body. She had a touch-based system that consisted of different pictures she could click on to form sentences that would be relayed orally by the machine. This was a great tool for helping her communicate with her peers and teacher. Later in the year, her parents purchased an iPad for her that she started bringing to school. The iPad allowed her to pick up many skills due to the various apps she could use. I got to see her knowledge base, confidence and communication ability/skills grow a lot in a year, and I was impressed by the power of touch technology and how it improved her learning capabilities.
I have also noticed that touch technology has allowed very young children to learn how to navigate and use tools that might not have been possible in the past. I have seen children as young as one year old click on a screen and try to manipulate what is happening on it. I have also seen how they pick up these skills quickly and can be using the device and doing something productive by the time they are around three years old. It is amazing to see what these children are capable of and how easily they learn tasks. They cannot read what is on the screen, but the simple process of looking at objects, touching them and manipulating them allows them to gain knowledge. This is very similar to what Sugata Mitra observed in India. Children may not have the language skills, but the simple ability to see and touch allows them to learn when using technological devices.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
I definitely think this innovation will open many doors in the educational realm. I think that touch devices will allow for a lot more personalization of learning and for more collaboration amongst students. It will also open a lot of doors for students with special needs.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
The main way would be updating hardware and software to allow capabilities such as touch screens to be present. In part, this could also be accomplished by allowing BYOD. I also think this will result in a restructuring of curriculum design and implementation. Learning will become more personal; each individual will be in charge of their learning and will be able to expand on it in the way that is most conducive to themselves. Thus, this would require a restructuring of the age-old parameters of knowledge delivery and reception.
Nureen
Thanks for sharing Nureen.
I agree that a solution to this problem might be BYOD and getting students to become greater participants and drivers of their learning. If we move forward with this aim in mind, what will students need to succeed and how will this change the roles of schools and teachers for students?
I think question # 3 is key, “What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?” The Horizon Report on Singapore, in the ‘Top Ten challenges’ (pp 19-20), mentions several important things, like the challenge to get teachers to adopt technology; integrating 21st century technology in schools that still function as if they were in a 19th century setting; adapting assessment to portray the kind of learning these new technologies afford, among others. However, perhaps the answer to these and other problems should be left to a group of kids to solve. Judging by Sugata Mitra’s experiment ‘Hole in the Wall’, I’m sure kids would come up with a lot of useful and practical solutions for the sake of having access to these new technologies.
(Thank you to this week’s team for introducing us to this very interesting experiment. If anyone wants to read up more about this, here is the site http://www.hole-in-the-wall.com/ .)
Hi,
You bring up an interesting point. I wonder how our perspectives as educators form our views of these technologies compared to the perspectives of the students. We look at a technology and think ‘how can I use this to help students learn’, whereas students look at a technology and genuinely explore it, find entertainment with it, and can learn as a result, sometimes by accident. I think that it would be very useful for us as educators to just sit back and watch these students to see what type of learning is occurring as a byproduct.
Jhodi
1. I have not witnessed large changes to accessibility for large groups of students in my region. I do see that SMART Boards, as a touch technology, are being used to affect how teachers and students interact with the material on the screen. On a smaller scale, though, I have witnessed how the iPod and iPad are being used with students with accessibility issues while using various apps. Some of the gesture controls available through both products have proven to be beneficial for many students. These are easy to change and adapt to the needs of the user. This allows them to control the device when vision or fine motor issues might normally cause problems.
2. I also think that the features mentioned in number one will continue to improve and will provide more opportunities for others. Some of the touch features, in unison with the voice features we have looked at, will continue to change the way people interact with their devices, opening more doors for participation and collaboration for a wider group of learners. The best feature, as presented on the Touch page, is the ability to participate with these complex features without needing to be specialized or trained to use them.
3. Schools need to change, period. The system is not set up to allow for innovative use of technologies in large school districts that follow centralized decision-making practices. Decentralization will be the only way that schools and teachers will really be able to benefit from innovative products. Grant applications for faculty are another way to motivate teachers to become creative and strategic in their use of innovative technologies.
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
I have no personal experience using touch technology; however, through my search over the internet, I have come across lots of research agreeing that one positive result of using touch technology in the classroom is that it promotes collaborative learning interactions. For example, this study, “Are multiple-touch surfaces better than single-touch for children’s collaborative interactions?” http://oro.open.ac.uk/19510/1/os-cscl2009.pdf, interestingly concluded that the single-touch condition allowed only one child to interact with the digital content at a time, whereas with the multiple-touch condition, the children could interact with the digital content simultaneously. Results also showed that the touch condition did not affect the frequency or equity of interactions, but did influence the nature of children’s discussion. In the multiple-touch condition, children talked more about the task; in the single-touch condition, they talked more about turn taking.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
Collaborative learning is important for 21st century learners; it helps students to become critical thinkers. Therefore any devices that can help promote this type of learning should be incorporated in the classroom.
In my opinion, touch technology will have more opportunity to thrive in a global classroom. Compared to voice and gesture technology, touch technology has a more international concept. People around the world tend to interpret and receive the meaning and benefit of touch in the same way. For example, babies that come from different cultures can benefit the same from a loving touch of other human beings.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
In order for schools to implement touch technologies, they must firstly see the benefits of using it.
Hi Paula,
I agree that a stronger case needs to be made for the value proposition of having technology integrated into our education programs.
I would also suggest that we consider whose roles and interests would be threatened by pursuing such an approach and get them involved in the change process by finding ways that they too might benefit.
Hey All,
1. I have not witnessed an improvement in technological and educational access across geographical or socio-economic barriers, but I have seen a bit across age groups. My friend’s son is almost 3. The other day, he was showing me pictures of their trip to Edmonton that were on her iPhone and was able to zoom in on himself and flip through the pictures. Although it was pretty cool that he could do that, I was more fascinated when he hit the home button, scrolled through several screens, found the game he liked, and started playing it. As someone else has mentioned, the makers of touch technology make it fairly easy to navigate from screen to screen, but that he understood how to hit the home key and find the game was still pretty fascinating.
2. I think that in the next 10 years we may move towards multi-touch desks or tables in classrooms. Interactive whiteboards have some capacity for multiple people touching the screen, but it is fairly limited. With multi-touch desks, more people can interact with the table or desk, and with the people around it, at the same time. A multi-touch desk is not controlled using a mouse or keyboard, so it provides a way for everyone to interact with the system. We can already see this trend of touch screens with iPads (or similar devices) and with many cell phones.
Here are some of the concerns I have with interactive or multi-touch desks:
a) The cost – how much will it cost to have these desks placed into one classroom, let alone into a whole school?
b) Vandalism or destruction of these desks – students at my school like to change the keys on our laptops or write on their desks. How would we prevent damage to the interactive desks from occurring?
c) Does interactivity with technology actually improve students’ achievement? I think this heavily depends on how the teacher structures the lessons around the use of these interactive desks.
d) Further teacher training – teachers will need PD opportunities to come up with good lesson plans to use these desks.
Lisa
Lisa,
You mention some very real and legitimate concerns about cost and vandalism. I wonder if as an education provider, you might have some suggestions as to how we might mitigate these risks – perhaps based on what we’ve learned from integrating some technology into schools already.
As for PD for teachers, are there different ways we can go about this? For example, by having math teachers share their experiments and success stories with others like them? Also, do teachers necessarily have to be the experts, or can they in some ways let technology do the heavy lifting while they become better facilitators of student learning and exploration?
I have noticed technology shifting towards intuitive touch interfaces. For example, the computer mouse has advanced from an independent object connected to the computer via a cord that moved a pointer on the screen in the same direction as you moved it on the table, to a trackpad for your finger moving a pointer on a laptop’s screen, to a tablet where your finger literally is the mouse and controls the pointer. Other technologies are becoming intuitive as well, such as the arrow on an iPhone that says ‘slide to unlock’. Technologies such as this could be used without the words; one could probably gather that they need to slide the flashing arrow in the direction that it is pointing without any words attached to it. Some of the best applications of touch technologies that I have seen have been used with students with special needs. I have seen great apps on the iPad that allow students to use their fingers to navigate and move things around. Students that could not use a mouse with a computer screen have been able to use their fingers to touch what they need to on a tablet. I think that one of the largest benefits of touch technology is its ability to communicate the same message across all languages using symbols, sounds, and intuitive gestures.
Jhodi
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
I haven’t seen this myself. What I’ve found, mostly, is that those who don’t have geographical or socio-economic barriers are the ones who are able to afford/access technology. As was mentioned in the website, it is these people who would benefit most from this kind of technology that are often unable to use it.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
I would hope in the future that it would be able to reach those people who need it. I see technology as a leveller, except that, as it is so expensive to acquire, these levelling possibilities are often unrealized. Touch could provide (as mentioned in previous posts) a whole host of opportunities for students who have physical limitations.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
Perhaps, as Peggy mentioned, we need to look at separating funding. If you wait for the bureaucracy in a district to play its course everything will be outdated. Perhaps the way we fund and “administer” education needs to change? Traditional classrooms might not be the way our students are going to be most successful. Who drives that change though? Students? Teachers? Administrators? Technology experts?
Rebecca,
You are right to point out that technology, like education itself, is a great leveller.
If we recognize this, then bringing technological access to the remote – be it because of geography, socio-economics, or mental and physical barriers – will empower them to become valuable participants in our societies; and that is something that benefits all of us.
To do this, however, requires leadership. And in collaborative leadership, each and every one of us has a role to play, be it as a student, teacher, administrator or technology expert. Would you not agree?
1) In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
Definitely I have seen touch improve technological access across age groups, as similarly mentioned by both Scott and Jenny, who discuss how it has been of great assistance to students with special needs. Additionally, as Manny discussed, the intuitive nature of touch technology allows people of all ages to interact. I have just begun using iPads in my classroom and I am surprised at how few questions the students ask of me compared to when we are using iMacs in the lab. They seem to be able to figure things out on their own.
2) What do you think this innovation might lead to (opportunities) in how we provide and access education in the future? AND 3) What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
What I find most captivating is the discussion brought forth by Sugata Mitra showing how, without teachers, students demonstrated the ability to learn through technology. It connects to another theorist, Seymour Papert, who argues against learning by being told and instead believes learning should be acquired through exploration. Papert and others such as Ivan Illich believe that technology will not improve schools but eventually replace them in the future. There is a great video of a discussion that took place between Paulo Freire and Papert on this topic. Here is the link https://www.youtube.com/watch?v=4V-0KfBdWao&feature=share&list=PL4UARNpBiEHpGbm7Vs4RbIgVTkKJ1HO5k
I find myself leaning more towards Freire’s hopes for the future of education. Although I see the need to incorporate technology in education, I hope that it doesn’t replace teachers. In the future I hope the system of education (our current model) goes through massive changes to reflect a new paradigm… one that focuses less on covering content and more on understanding it. One that encourages personalized knowledge, collaboration, and critical thinking. I see the need for teachers to help guide and facilitate this process.
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
The biggest impact that touch technology has had on education is in its intuitive adoptive nature. Touch technology provides a means to communicate by using a very natural gesture as opposed to mice and keyboards. In doing so, it allows the young, the old, the physically and mentally challenged to communicate using an interface that does not require much learning at all. The biggest hurdle at this point is making this technology available to all in an affordable way.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
We’ve already seen how the interactive nature of touch technology can motivate students to participate in their education. If the technology becomes widespread and affordable, it should provide equal opportunities for students requiring a more hands-on approach to learning. Touch based learning allows students to explore a greater amount of content that becomes available just at their fingertips.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
As for how schools need to change, it is mainly in the delivery of education. Schools will need to focus on a student-centred learning approach as opposed to the more common teacher-centred approach. Teachers need to learn that they must not be masters of content but rather become facilitators of content, resource guides, mentors, and support pillars for students. The student will need to take more responsibility for the learning and knowledge construction and actively participate in the journey. One major change for schools would be the introduction of technology resource department that constantly searches and analyzes new content, applications, and approaches to education in order to facilitate and support teachers’ adoption of these resources in class. It is becoming impossible to keep on top of educational technology and available resources as many companies are trying to capitalize on the booming market. Schools will need to assist teachers in finding the right tools for their students, for the curriculum, and for assisting teachers in providing the richest educational experience possible.
Question 1:
I understand the fascination with touch devices. I am fascinated, my children certainly are, and even my husband is as well. Papert makes the point in Mindstorms that we should leverage an active engagement with computer cultures to “develop new ways to think about thinking” and not, as is done “in most contemporary educational situations where children come into contact with computers the computer is used to put children through their paces, to provide exercises of an appropriate level of difficulty” (http://www.arvindguptatoys.com/arvindgupta/mindstorms.pdf). Touch devices seem intuitive and lend themselves to users doing more – they have the potential to facilitate active engagement with the computers/devices. I have witnessed this with my daughter, who is dyslexic but has been actively engaging with the computer for meaningful learning. By not focusing on spelling errors, etc., she has been better able to search, find, make the linkages, and process and produce in her own way, using technology. Touch devices are a tool for learners and in particular for special needs learners.
Questions 2 and 3:
I think touch devices lend themselves to “learning that happens deliberately without teaching” and “without curriculum” (Papert, 1980) – a ‘classroom of one’ kind of approach. While this approach holds some appeal for me, I wonder about its practical merit. Planners/decision makers grapple with the constraints that limited financial resources place on the expansion of schools, adequacy of materials, hiring of staff (including teachers), and the purchase or upgrade of current technological resources. Given the fast rate of technology obsolescence, a wholesale adoption/incorporation of these devices might not be fiscally prudent.
I like to think the use of computers in the constructivist sense is growing; more educators are seeking to use educational computer programs and devices to incorporate affordances such as scaffolding, organizing, reflection, visualization and problem-solving into their lessons. While I think touch devices are great and could really facilitate many of these, I think this can also be achieved by incorporating many of the Web 2.0 GUI technologies/resources that are available.
Sophia
Sophia, you make well-researched and well-articulated points.
I think you hit the nail on the head with your argument that technological advances free us from previously labour-intensive constraints such as teaching, and allow us to think more about thinking (meta-cognition), i.e. what it is we value about learning and how to use new technologies to better achieve it.
While fiscal elements are very much an operational reality, I don’t think that you mean to argue that special needs children should have their access to enabling technologies such as touch limited by this constraint. And I would argue the same logic holds for the geographically and socio-economically remote. Surely tradeoffs have to be made, but the end goal should be to give everyone access to the technologies that best allow them to achieve their greatest potential along Maslow’s hierarchy of needs. And just as Touch requires us to challenge our beliefs about the role of education, I would argue that it will also require us to challenge our thinking about how to finance it, or at least, how to spend the money we have allotted to it already.
Touch technology is everywhere, and across all age groups. Outside of every Apple store you can witness the whole lifecycle sitting at tables, participating in the Apple Learning classes. I personally have made the transition with several of my devices, and the “Beyond” section for this week’s presentation is fascinating. I have now downloaded Dragon Dictation to my iPhone, am using Flutter gestures on my iTunes music, and have been exposed to some new ideas to include in my curriculum for teaching the delivery of care for the special needs patient.
In my experience (adult education) I have not witnessed touch technology improving technological and educational access across students, nor do I see geographical and socio-economic barriers to accessing technology in the college environment. The student population is diverse both geographically and socio-economically; they have access to technology on campus, both touch and non-touch, and most appear to be suited up with BYOD – tablets, laptops and mobile devices. Similar student activity occurred prior to the launch of touch devices. I don’t see touch technology alone as improving access; rather, advances in technology and pedagogy together are transforming education. Digital learning promotes interactive, constructivist learning that is facilitated and self-directed, and touch technology has provided improved applications and methods for users to access and deliver information. Perhaps this has changed the way curriculum is delivered and addressed various learning styles: touch, visual, interactive, engaged. Touch has replaced the mouse and makes it easier for the user to navigate through information. Voice and gesture applications are promoting interactive ways of communicating, teaching and learning. Gesture, in particular, creates fundamental advances in teaching and learning for healthcare programs, offering new ways to communicate and learn for groups such as the physically and mentally disabled, and the elderly. Post-secondary education has become a business venture: the student is the customer, and resources such as technology are in place to serve that customer. Educational institutions that offer current resources and methods of learning also create a competitive edge in attracting future customers. In my institution, technology, research and innovation are part of the 2020 strategy.
Catherine
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
I have been fortunate enough to have worked in a few classrooms that have incorporated touch technology to increase accessibility to the curriculum for special needs students. My older students (ages 14-22) were classified as Profound Mentally Handicapped with capacities determined to be those of infants. With hand over hand assistance, these students were presented with lessons in cause and effect that corresponded directly to their own actions. Though it was virtually impossible to determine whether or not the students actually fully understood the relationship between their actions and results, some did progress to activating touch technology on their own, with and without prompting.
My younger students were enrolled in a life skills class where we focused on simple literacy and mathematic skills as well as working on fine and gross motor skills. By utilising a removable touch screen on the existing school computers, students with lesser dexterity who may not have been able to move a mouse, could still access reading programs like Starfall. This school was on the lower end of the socio-economic scale of the district, but due to the efforts of the teachers in the program, the administration knew of the benefits and made budget allowances for such technology that could contribute to student success.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
I do think that children are going to be exposed to and start using this type of technology earlier and earlier. The Christmas adverts I am seeing here in the UK advertise many lower end, toy-like, child-friendly substitutes for iPads and e-Readers. I do think that the availability of such toys will be largely determined by socio-economic status, thus possibly contributing to the digital divide. In education, I have witnessed pilot programs (in the US) to provide every child within a school with a laptop. One school in particular was extremely successful, but the program was stopped when district funding ran out and other schools complained that they did not get the same opportunities.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
I think that administration needs to look at the budget very carefully, taking all costs into consideration and weighing the advantages of adaptable technology. (New books versus an update of existing technology, etc.) In this vein, I would hate to see schools adopting technology but not keeping up with it. I would compare this to using very out of date textbooks, which still occurs, especially when budgetary constraints become tighter.
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
No, I personally have not witnessed touch improve technological and educational access across age groups. I have traveled a great deal in the context of education, and I have not yet seen the impact of touch in other parts of the world. For example, I was recently in Bhutan, where I was teaching a screenwriting workshop. Most people in Bhutan have some kind of access to a computer at home, but I would estimate that less than 1% of people own a tablet. I would be surprised if even the King of Bhutan (who is actually now the Prime Minister/democratic leader of Bhutan) had an iPad. I was also in Rwanda about 6 years ago. Of course the touch revolution only happened more recently, but again, I would argue that most people in Rwanda do not have a touch device, as many do not have power or running water.
Obviously the touch revolution has impacted the developed world heavily; however, I have not been involved with classes that have made use of this technology.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
I think initiatives such as the one by Mitra are truly inspirational and show one way innovation in this type of technology will impact the future of education in the world. I heard of Mitra’s Cloud Grannies a while ago and was very moved, and I believe more of these experiments should be undertaken. This coincided with the time that I started my PhD in Online Film Education. From my own perspective, I believe innovation will allow the developed world to help the developing world. However, this very much depends on Internet availability.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access? I think BYOD will come into effect very quickly in order to accommodate touch technology without causing major budgetary disasters. We recently had a coordinator’s meeting at my institution, where the topic of discussion was taking computers out of the labs (because our students prefer to bring their own) and reinvesting in other kinds of technology that students cannot afford themselves.
One last thing I wanted to mention: a friend of mine made the film MADE IN CHINA. It premiered at Hot Docs this past spring. It is about the city in China where most things are made, including Apple products.
http://www.hotdocs.ca/film/title/made_in_china
Unfortunately, the conditions in which these people work are not good at all; the hours at these assembly factories are inhumane and many workers have committed suicide. I feel obligated to relay this information in this context, as I was very disturbed to learn that the world’s consumption of tablets (and most other things) is having such a detrimental impact.
Furthermore, the environmental damage from the waste of discarded technological devices is also something we must consider, as that waste is only increasing exponentially. There are many relevant films and YouTube clips on this subject as well.
Food for thought.
1. In your experience, how have you witnessed touch improve technological and educational access across age groups, as well as geographical and socio-economic barriers?
The only experience I have regarding touch in education is the SMARTBoard. I am not sure if it helped improve educational access because my experience with the SB is somewhat limited. One thing I know it did was help students interact with content without the restraints of a mouse and keyboard. I cannot say it helps students see content on a different level because the projections were 2D and the SB I tried was limited to single touch. From this experience I cannot say that touch has improved technological and educational access across age groups or geographical/socio-economic barriers.
2. What do you think this innovation might lead to (opportunities) in how we provide and access education in the future?
With multi-touch and 3D displays, I think education can be accessed on a different level. Perhaps it would enable learners to read, write, interact with the content (i.e. do science experiments) more “naturally” as they would using concrete learning objects but with the added support of multimedia and internet access. Being able to use one’s fingers to rotate, zoom in/out, and flip 3D objects would help tactile learners interact with the content more effectively and enhance the learning experience.
3. What are some ways schools might need to change, in order to lead and facilitate the innovations that touch can provide in educational technology and access?
In order to achieve this, schools would need to have access to the hardware that supports touch. This can be achieved by fundraising or partnering up with companies such as Apple/Microsoft. The former would require large sums of money and the latter might not provide enough hardware to the majority of students. These matters need to be discussed with the School Board and their respective PACs.
James
Throughout this unit we attempted to push the boundaries of emerging and existing technologies within the expanding contexts of voice, touch and gesture, concluding with a section on the future possibilities being discussed now. All pages contain links at the top in the toolbar and buttons at the bottom to help you progress through […]
Continue reading Welcome to Week 11: Voice, Touch and Gesture Posted in: General, Week 11:jenbarker, lullings, visramn and one other person are discussing. Toggle Comments
I simply wanted to take a brief moment to thank all of you for creating such a well-designed Weebly website. Your carefully curated content and attention to detail (including easy navigation, clean use of type and page layout), along with CC-sourced images, have resulted in a website which is full of useful content wrapped in an effective presentation. Kudos for your effort!
I echo Scott. Your website looks extremely professional. I also liked the way you chose to set up each page with the headings voice, gesture, and touch. The videos you chose expanded what I already knew and opened my eyes to many future possibilities. Thank you for all your hard work.
I agree with Scott. It is a very good site and the examples on there are great. I really enjoyed watching all the videos and I feel like I have learned so much from your site. Thanks.
Nureen
Thanks Scott and Nureen,
In creating the site, my main goal was to keep it clean and simple. I wanted the content to be unrestricted by technology and visual trash – from your feedback, I think I have succeeded, with you both anyway. As for the content, it was a collaborative effort by the team, and your positive feedback is appreciated.
Most of us are aware that some form of voice recognition software exists. However, not a lot of us have actually tried it, so this section is designed to get you to interact with some form of voice recognition software during a regular daily activity, and then record your experiences, (both positive and negative), with […]
Continue reading Week 11: Voice Posted in: General, Week 11:joeltremblay, melissaayers, jameschen and 17 others are discussing. Toggle Comments
Hi all,
I decided to briefly discuss some of the pros and cons to using the Dragon Dictation app. The Vocaroo software asks for access to the camera too but I don’t believe it makes a recording, just wondering if anyone else experienced that. Anyways, enjoy……
Hi Manny,
I actually use the PC software to do essays and film reviews. It definitely works better than the app because it has the chance to learn from your voice and the accents therein. I’m curious whether you think accents etc. change the accuracy, because I’ve found the iPhone app to be less accurate when I drop into a Canadian drawl.
I believe my hyperspeed drawl is also unappreciated. Here’s my take on it. I could see it being really fun to do with friends. For instance: who can come up with the most wildly inaccurate dictation. I’m really interested in what it does for education in the future though….
I had a similar experience Rebecca. Not perfect, and I found myself trying to do a recording that was perfect. My school is all ESL, so I doubt too that I would be able to use it in my teaching right now.
Interesting that it came up with curse words, Rebecca and Mike, especially since it pulls from your phone’s current dictionary (the one that you constantly update when you send texts etc.) 😀 Sorry it wasn’t more useful for you. Do you think the tone of voice rather than the speed had anything to do with it?
Hi,
I used the app Vlingo. Click on the link to hear my views on this app: http://vocaroo.com/i/s1qXtWcNco2p.
Thanks,
Nureen
Hi Nureen,
I’ve found that Vlingo is definitely accurate and is quite useful for hands free control of your phone. What were you using it for specifically? (feel free to reply via vocaroo if that’s easier for you).
I used it for emails, texts, to navigate on the internet, etc. I found that it could not understand names and kept pulling up different names. But that could be because many of the names in my address book might not be in the word bank they use.
Nureen
I uploaded Vlingo to my iPhone and found it easy to use but not accurate. For more of my thoughts click on the link: http://vocaroo.com/i/s1Cj45Ri7g8C
Hi Jenny,
I use Dragon on my iPhone over Vlingo usually, although I have both. I’ve found that the accuracy is a little bit better as far as sentence structure goes, but it doesn’t have the hands-free functionality that Vlingo does, unfortunately. Thanks for trying it!
I recently had a pretty good experience using gesture and voice with my Android phone. There is a keyboard called Swype that everyone should try. It is fantastic. Instead of typing on individual keys, you put your finger down on the first key of the word, then slide to the next letter and the next without lifting your finger. After just a few minutes of playing with it, I was typing about 20 words a minute on my cell phone. I have now been using it for a couple of years, but it was recently acquired by Nuance, the same company that makes the Dragon NaturallySpeaking software. Between the two options, I was able to take an idea that I had in my head for a picture book and get the whole thing written, page by page, with just a cell phone.
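As a rough illustration of the general idea (a toy sketch only, not Swype’s or Nuance’s actual algorithm; the word list and swipe path are made up), a gesture keyboard can be thought of as matching the letters the finger passes over against a dictionary:

    # Toy sketch of gesture-typing lookup (NOT Swype's real algorithm).
    # A swipe is approximated by the letters the finger passes over; a word
    # matches if it starts and ends on the same keys and all of its letters
    # appear, in order, somewhere along the path.
    DICTIONARY = ["hello", "help", "great", "gesture"]

    def is_subsequence(word, path):
        """True if every letter of word appears in path, in order."""
        it = iter(path)
        return all(letter in it for letter in word)

    def candidates(path, dictionary=DICTIONARY):
        return [w for w in dictionary
                if w[0] == path[0] and w[-1] == path[-1]
                and is_subsequence(w, path)]

    # Hypothetical key trail for swiping h-e-l-p:
    print(candidates("hgtfrdewsdfghjklp"))   # ['help']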
One thing I did find however is that different writing mediums seem to promote a very different flow of ideas. The research seems to bear this out. For example, here’s an article that discusses how handwriting seems to stimulate the brain much more than typing:
http://www.nhs.uk/news/2011/01January/Pages/writing-versus-typing-for-learning.aspx
As someone who once aspired to be a writer, I definitely found that stuff I typed and stuff I hand wrote was very different in style. Now, with this sort of sliding, gesture-based keyboard and voice dictation on the table, it will be interesting to see what new research reveals.
Hi Ben,
Isn’t that similar to what Samsung has done recently with their smartphones? Interesting that they have tried to apply it to keyboards as well.
They just included the Swype keyboard by default instead of the stock Android one. Same keyboard.
Quick review on Siri (Iphone), Siri (Mac) and Google (iPhone). I don’t usually use voice commands through my phone so it was nice to have an opportunity to play around with it again. I’ve created a Vocaroo (thanks for introducing us to it — super neat) about my thoughts:
http://vocaroo.com/i/s01JsQakIitT
The surprise find for me was the Google one in the end. I found it was able to predict my speech fairly accurately and quickly. I did a bit more research and found that indeed it was fast. Take a look at this one video where the person races Siri against Google!
http://gizmodo.com/5956433/google-voice-search-vs-siri-whos-the-best-iphone-assistant
Hi Jonathan,
One of the benefits of any Google software tends to be its accessibility and how much more the developers listen to the community, so I’m really not that surprised by the Gizmodo article, although I wasn’t aware of it before. Thanks for thinking outside of the box and using Google voice, as I completely forgot to include it 🙂
I concur with Jonathan on this one: internet searches on my iPhone 4 are a pleasure with the most recent voice-enabled search app from Google. The speed in particular is simply remarkable.
While I have found Dragon Dictation and Search convenient at times in the past, the lack of full integration with iOS limits their overall usefulness, I think. This highlights an annoying problem many people have recognized with technology of all sorts today: because tech giants such as Google and Apple have ‘issues’ with each other on a business or IP level, we the customers often suffer as a result.
Google’s voice technology appears to be superior to Apple’s at present, at least for internet searching, yet Apple’s ongoing patent disputes with Google and other manufacturers mean that in some cases Apple customers are being forced to use an inferior product. This very situation recently played out with the whole map fiasco in iOS 6, which has left Apple customers with a poorly implemented map app compared to Google’s. Similarly, I believe my iPhone 4 has the hardware capability to use Siri; however, software licensing constraints with Nuance prevent the phone from being able to use this innovative new feature.
As a form of NUI, voice control is very exciting and still in its infancy really. I only hope that there can be more ‘democratic’ implementations of the technology in future.
BTW, I wrote this entire comment quite efficiently using Apple’s built-in voice dictation function, one of the stand out features in its latest desktop operating system, Mountain Lion.
Happy dictating all.
It’s interesting how complex the speech above is. I’ve found that voice recognition software has big issues with abbreviations especially. I’m curious whether you were doing a lot of editing or was it just really accurate?
Great points Scott. I have not upgraded to Mountain Lion yet. When I was researching the Lion speech commands info it seemed pretty basic. It reminded me of those functions available to my iPhone 4. I think for many of these features time and resources will only make this technology better. As you mentioned they are still young.
I’m interested in just how different the functionality of the two operating systems is. I wonder if there are comparisons out there on the net, as all I was able to find were comparisons between Dragon and Mountain Lion or someone talking about the benefits of it:
http://www.gottabemobile.com/2012/07/29/osx-mountain-lion-review-dictation/
My vlingo review
http://vocaroo.com/i/s1IAQzWBrzEU
How was Vlingo as far as commands etc.? What worked for you and what didn’t?
A humorous clip showing how voice activation is far from glitch free:
http://www.complex.com/tech/2011/10/video-jack-donaghy-predicts-the-future-of-television-sets
love it…Jack is great, and I agree with him, not a big fan of the remote getting lost. I always thought that a page button like on a cordless phone would be nice if it was built in.
My Vocaroo review
http://vocaroo.com/i/s0WuSHJyHCu1
I tested this voice software for podcast possibilities. I found it to be very clear. I liked that I was able to download as an MP3 and was able to open in Media Player as well as Quick Time. I was also able to edit it.
Sophia
HAHAHAHAHAHAHAHA… I was trying to figure out exactly what the hell this was!!! Too funny 😀 I’m guessing Vocaroo was useful for you if you’re using it for something other than this course 🙂
Here is the correct link: http://vocaroo.com/i/s11GVVOnlOM6
Sophia
I’ve actually started using Vocaroo for a lot of activities in the class, including film reviews etc. Something else you can do: when you go to the link, you can right-click on the player and save it as an MP3 as well, in case you accidentally click the wrong thing.
Thanks for the tip.
My Voice talk review
It’s a little strange talking to the computer sometimes. My students always look at me funny when I’m doing it.
That’s because the technology is new to us and we haven’t gotten used to it yet. Most people, I am sure, would prefer to control the computer by talking to it rather than typing or using a mouse, because talking comes more naturally to us. Furthermore, the push by some software makers to have users personalize their software at the interface level makes the user experience a personal one. This fits in with the idea of personal learning that we looked at here before; of all the options we looked at this week, voice has the potential to make computer usage almost like communicating with another person. Maybe this can make the difference for learners who are technologically challenged in one way or another.
As I said on Vocaroo, the current offerings all sound dead and robotic. When the day comes that computer-generated voices sound more natural and are more customizable, voice could serve as the pathway by which a learner is able to grasp a difficult concept.
Dragon Dictation review:
Thanks for the participation Mike,
Interesting that you had some problems with the exclamation marks and the curse-word censorship. My wife and I tend to use it for shopping lists etc.; we have ceased using pen and paper and instead just speak at Dragon and text the list to each other.
Hi Mike,
I quite enjoyed listening to your experience. I also tried to get punctuation and I found that some things worked such as saying ‘period’ or ‘exclamation mark’, but that does make me wonder what if I meant period in the sense of a hockey game or something and not the punctuation? I found some issues with the app’s accuracy. I was frustrated with the app in that I would have to read over what I said and make corrections to the words because it would have totally different words than what I said. I do wonder how this app would work with different accents?
Jhodi
Here is my recorded reflection on using Siri. http://vocaroo.com/i/s0WQybGWkDuy
Thanks for providing this activity. It gave me an excuse to spend some time trying to use Siri, which I have not done before.
The issue with quick responses seems to be widespread, which is really interesting considering that Apple always pushes the response time during their commercials. False advertising maybe?
I’m not sure if it is false advertising or optimistic that one will have a good signal all the time. When it comes to simple things, like playing a song or dialing a number the response time was fine. It was when I asked for more detailed things like the movie listings or directions that there seemed to be more pauses. I think we have gotten spoiled with the immediacy of Google, Bing, and Yahoo searches that if something takes 4 seconds we start to think it is broken.
This reminds me of comedian Louis C.K.’s talk on the Conan O’Brien Show about technology and how it is wasted on us. The bit was called “Everything is Amazing and Nobody is Happy”. It seems as though the original has been removed from YouTube, but you can watch his stand-up version, though it is not censored.
He also does a great bit on Twitter, social networking, and smart phones. He talks about how we are living our lives through the little screen on the camera. This is similar to the point I was making about the AR goggles in another thread.
After reading Scott’s post I was really excited to try dictation on my iMac but unfortunately although I have upgraded to Mountain Lion, it doesn’t appear that I have the dictation feature. In my system preferences I only have “Speech” as an option. Scott, I am surprised that your text was written with correct punctuation. Did you have to edit it?
I am going to try Siri on my husband’s iphone tonight and will post my Vocaroo later. In the meantime, what is the name of the Google app everyone seemed to like? Do you download it from the app store and does it cost anything?
You enable voice commands on your Mac in the Accessibility section in the system preferences. It’s the last item on the list–“Speakable Items”. (I am on Mountain Lion too–btw, have you noticed a loss of battery life since you upgraded? My Mac was only a year old and my battery went from 6 or 7 hours down to about 3 after the upgrade, and it’s sluggish and awful. I wish I had stuck with Snow Leopard.)
Yes Ben I have noticed the same thing. I don’t go anywhere without my charger. BTW I don’t have an Accessibility section. I went to system preferences, and under the section system, I have a microphone labelled Speech. I clicked this and was able to turn speakable items on and off but it won’t allow me to convert voice to text. It only answers a couple of questions.
In my system preferences, in the system section, I have an icon for Accessibility and I also have an icon for Dictation and Speech. Once enabled, I just have to click on fn twice to turn it on.
Hi,
As a language teacher I have always been interested to see what voice recognition machines do when you speak with an accent. I downloaded Assistant Personal Secretary for my phone and said: “Hello mate, got a light? Oh, and could I have a glass of water, please?”, but it clearly did not understand my ‘cockney accent’ (http://vocaroo.com/i/s0rCj0CL38QM). These voice recognition programs that transcribe what they hear still have a long way to go, particularly if you take into consideration that there are more speakers of English as a Second Language than native speakers. Still, it was a lot of fun. Vocaroo, on the other hand, I have used in my classes and think it’s a great tool.
As the technology gets cheaper and more accessible, I think we will see this change. While it is possible to ‘train’ these programs to better recognize your voice, we are likely to start seeing add-on packs and so on for more regional accents in the same way that we already do for different languages. Many of these programs are already modular and allow users to add additional functionality by downloading extra bits.
Here is an interesting overview of accents research and voice recognition software. It was an interesting read and outlines some of the challenges in terms of getting it to work:
http://www.phon.ucl.ac.uk/home/mark/accent/
As he explains, an accent mismatch between the speakers used to ‘train’ the software as it was developed and the user can lead to up to a 30% error rate. And it would appear that the solution is not going to be a one-size-fits-all program but rather by leveraging the modular nature of the software and including ‘language packs’ that may be based on specific accents rather than different languages.
I am willing to bet that once some open source players get in on the game, then individual groups will be able to create their own language packs based on a very specific regional accent. For example, as this technology is integrated into the browser, as happens with most technologies sooner or later, a company like Mozilla or Google may use an open architecture that will allow these sorts of plugins.
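As a small, hedged illustration of that modularity (a sketch assuming the open-source Python SpeechRecognition package and Google’s web recognizer; the audio file name is made up), you can already ask some recognizers to transcribe against a specific regional variant:

    # Sketch: transcribing the same clip against two regional variants,
    # assuming the SpeechRecognition package (pip install SpeechRecognition).
    import speech_recognition as sr

    recognizer = sr.Recognizer()

    with sr.AudioFile("cockney_sample.wav") as source:   # hypothetical clip
        audio = recognizer.record(source)

    # In principle, the accent-matched variant should come back with
    # fewer errors than the mismatched one.
    for variant in ("en-US", "en-GB"):
        try:
            print(variant, "->", recognizer.recognize_google(audio, language=variant))
        except sr.UnknownValueError:
            print(variant, "-> could not understand the audio")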
Hahaha… I like the translation that it provided for you. I wonder if the accuracy is representative of the entire population or if it is just localized to the Cockney accent? Also, Dragon for your home computer actually learns your accent and builds up a dictionary the more you use it so as you use it, it trains itself to your voice.
Hey All,
Here is my Vocaroo: http://vocaroo.com/i/s0Tc6DOwnwOj. Note: I have a cold and did not use a microphone, so the quality is not so good.
I have recently purchased a Galaxy S3 and of course there is a voice recognition program on the phone. I have two options – one that is like Siri on Apple devices (S Voice or Galaxy) and a Google option. The S Voice gives me options to use voice recognition to do things with the phone, while the Google option allows me to search for things on the Internet. The S Voice is pretty finicky. It doesn’t always do what I want, but maybe I need to pronounce my words better. I can see where voice recognition could come in handy (such as using it for a to-do list, texting a message to someone, etc.). I think I just need to play around with it a bit more.
Lisa
Interesting that you had those difficulties with it. I wonder if that’s specific to the S3 model, since most people have reported Google phones being superior to the comparable Apple products as far as voice recognition goes?
Hey Joel,
S Voice is a personal assistant and knowledge navigator for the Samsung Galaxy S III and Samsung Galaxy Note II. Apparently, it can help with opening apps, setting my alarm, updating Facebook, telling me the temperature, sending texts or placing calls. I just find that it doesn’t always recognize the words I say very well. The Google option on my Android seems to understand what I’m saying much better and is quite fast at finding me information.
Lisa
A company called Maluuba, which makes a voice recognition app for Android that is apparently better than Siri, just released its API so that other apps can tap into its functionality. So if you were working on an app, say an ebook, you could make it so that if the user has Maluuba installed and running, they could interact with the book, turn the pages, whatever, using voice. Nice.
I downloaded both Vlingo and Dragon Dictation to my iPhone through the iTunes store; it took seconds. I had difficulty accessing my contacts through Vlingo, although the voice-recorded messaging worked well. I then downloaded Dragon Dictation and this application works fabulously. The voice transcription of messages is very accurate; with each pause a comma was added, and you had to indicate a period yourself. http://vocaroo.com/i/s0ZD6FuHKvrI
Catherine
Hi there Catherine,
Good to hear that you spent the time to actually compare the two services and that you were able to ascertain which was more compatible with your understanding and abilities.
I downloaded the Dragon Dictation app on my iPhone and found some positives and negatives to the app. I also noticed afterwards that this same program is on my work computer (I had seen the logo before, but had never tried opening it), and tried it on there as well. Here is the link to my Vocaroo recording:
http://vocaroo.com/i/s1w5koO8gXIP
Jhodi
Thanks for the reply Jhodi. Good to hear that you attempted a couple different options and didn’t get frustrated and walk away from it the way I might have done in your situation.
I tried to use Vocaroo this morning but after ten attempts I gave up. I am not sure if it was due to a setting on my computers (I tried both my laptop and iMac), but my voice sounded muffled and was almost inaudible. So instead I used a screencast program called Jing that I love. It records a portion of your screen and allows you to talk over it. I use it when responding to students’ papers and/or lesson plans, as you can scroll through the document on the screen and use your cursor/arrow to point things out while talking to them simultaneously.
On my Jing I gave a review of Siri. Here is the link to my Jing. http://screencast.com/t/Alj9CGtd
Hi there Jen,
I was having problems with it yesterday and the day before at the school, but I believe that was due to the internet connection and, more importantly, the upload capacity of the line. It could also be Vocaroo itself having issues, but who knows. Thanks for adapting and posting the Jing alternative though, as I’m sure other people have had this issue.
My Vocaroo recording on my voice recognition software experience is at http://vocaroo.com/i/s0zSG8gFEUqy.
I think voice recognition technology would benefit students with a disability who are unable to write/type using their hands. Another educational benefit could be to help English Language Learners improve their spoken English by trying to train the software to recognize their voice. Other than this, I wouldn’t recommend that educators make use of the technology – unless they have extraordinary patience.
James
Hi James,
Because of the quality of the Vocaroo recording, there is a somewhat Siri-esque quality to your voice. It’s interesting, as it sounds like you typed the text and software spoke it for you. Sorry it didn’t work out well for you, but we appreciate the perseverance.
Alright everyone,
As your guide through the mystical lands of futuristic and sometimes prehistoric voice command software, I bid you adieu and thank you for participating in our activities. Thank you for your insight, ideas and professionalism. Cheers!
Thanks to the group for introducing this activity, as well as introducing us to the Vocaroo tool.
I have been interested in voice recognition for many years (since I started my computer science degree, in fact) but have been disappointed by how slowly it evolved over the years. However, it looks like significant improvements have been made recently, thanks, I think, to the competition between mobile devices.
I tried Siri a while ago but, like many others have reported here, the experience was not too fantastic and I gave up not long after I started. I found it made too many errors.
In contrast, I have recently been playing around with Google Voice and find it amazing; the advancements in this technology are clear and I am sure the improvements will keep coming, both for this product and for Siri and Samsung’s version. It’s something I will definitely consider using both in a classroom and in daily life.
Just for fun I tried Dragon Dictation with the language set to French (Canadian) and it was a complete disaster. I know my accent in French is not that great, but the output was really nothing like what I said – not even a word. Once I switched it back to English it was a lot better, but it still made quite a few mistakes. I think I would need to get used to it a bit before I find it completely useful.
Overall I found that Google Voice was the best product for my dictating (and searching) needs.
The best part about the desktop version of Dragon Dictation is that it keeps track of the different nuances and tendencies in your diction. The mobile app is somewhat limited in that capacity, unfortunately, and because of that you get the mistakes you talk about. Just last night I was using it for something while quite tired from a particularly gruelling schedule of late, and it was making all kinds of mistakes.
teacherben 6:01 pm on November 14, 2012 Permalink | Log in to Reply
You are also welcome to use this space to respond with your thoughts to any of the content that you saw on the Beyond page.
Doug Connery 8:04 pm on November 14, 2012 Permalink | Log in to Reply
The Weeks Fly by.
As I was reading through the postings here in Voice, Touch and Gesture, I started to wonder how these technologies compare to the other six modules we have been exposed to and played with since October 1. Yes the weeks fly by. So here is what we have seen recently:
Week 5: Apps
Week 6: The Cloud
Week 7: Augmented Reality
Week 8: Personalized Learning
Week 9: BYOD
Week 10: Digital textbooks
Week 11: Voice, Touch and Gesture
1. From an educational perspective, which do you see as having the most promise in the short term and long term for your teaching practice?
2. From an investor’s perspective, which would you buy into?
visramn 8:18 pm on November 15, 2012 Permalink | Log in to Reply
I think a lot of these overlap, so it would be difficult to choose just one. However, I would have to say BYOD is currently very relevant in many schools. I can personally see how the adoption of BYOD is changing the dynamic of schools, because it is an initiative that has been taken on in the district I work in.
As for the near future, I would have to say voice, touch and gesture is going to become a huge part of learning and education. So many learners respond better when their learning interactions are visual and tactile. Not to mention, learners of today are drawn to technology so it is inevitable that learning is going to move towards newer innovations such as these.
Nureen
Jenny Brown 5:57 pm on November 16, 2012 Permalink | Log in to Reply
Great questions to pose Doug. I will focus on Question 2. All of these have investor opportunities (although I am not sure how to approach BYOD) and obviously all have risks but also potentially high payoffs.
• For apps there is a lot of competition and it seems like popularity spreads mostly through word of mouth. Look at Angry Birds, which is wildly popular. It may be difficult to predict what apps will really take off.
• The cloud, in my opinion, is the future but I think only a few large initiatives will make it to the top and you have to convince the public to buy your product or solicit really good advertising.
• Really amazing applications of Augmented Reality are still a ways off and I would see this as a high-risk investment, but one with potentially huge payoffs. Because the technology still has a ways to go, there are potentially good investment opportunities out there.
• Personalized Learning, I think, has more limited investor opportunities than some of the other technologies, but educators with the inside scoop on the pain points, and on whether a product addresses them well, might have an advantage when investing in personalized learning technologies.
• For digital textbooks, I think the players are already in place and it might be difficult to penetrate the market.
• For Voice, Touch and Gesture, I think the opportunity is similar to that of apps. You could invest in a great technology that just doesn’t catch on, or, if you are savvy enough (and lucky enough), you invest in one that really makes a presence in the market.
lullings 9:30 am on November 17, 2012 Permalink | Log in to Reply
Hey Jenny,
I thought I would weigh in with some trends that I have noticed here in Ireland in relation to apps.
I am not sure if it is due to our shattered economy or to a general worldwide trend, but apps are no longer carrying the strength they had. A lot of companies, including one of the national broadcasters, are abandoning their native apps in favor of putting a link in the app store. This allows an icon to be downloaded, but it actually just links back to their site.
For the most part this is only being done when the company’s website is responsive, meaning it adapts to the size of the screen that is requesting the information. So if an iPad requests the site, it sizes itself dynamically for that screen.
This allows for easy updates of the site, which filter down to the ‘app’, as it’s not really an app in the traditional sense. The biggest company to do this was Google. Since iOS 6, Apple has ditched Google Maps in favor of its own maps, which are really bad. So Google had a choice: go out and design, build and submit its own app, or create a link-style app that points back to its site. It chose the latter.
As a result it could mean less of an app-driven market and more of an integration between the app-style market and the rest of the web experience.
Does anyone else have experience with this?
Stuart
Jenny Brown 10:14 am on November 18, 2012 Permalink | Log in to Reply
Hi Stuart,
I don’t have first hand knowledge on the possible decline of the apps market but just some thoughts:
If information is non-interactive, a link to a site that formats itself to the type of device is a good choice. When you write a website that way, someone can see the page on different devices with very little extra development effort.
An app is a better choice when information is highly interactive because it needs to be optimized to work smoothly on the device.
Apps require more development effort, as different devices require different programming languages, and they need more maintenance to stay up to date with the operating system.
For example, staying with Google, if we look at Google Docs (a website running in a browser), it works really well on a larger screen (it doesn’t matter if it is a PC or a Mac) but it is not optimized for smaller devices. Microsoft Word, on the other hand, is an app with a lot more functionality that needs to be optimized for the device it is being used on (PC, Mac and, coming soon, official Microsoft Office apps for smartphones).
So I think that both web-based and app based information will prevail as there are specific uses and advantages to both.
Doing a short search on the Google Maps app, it looks like it will be making a comeback to the iPhone: a new report from The Wall Street Journal suggests that Google has already distributed a version of Google Maps for Apple’s iOS for testing. The insider who spoke to the paper explained that Google is trying to make sure that the native Maps app is ready for prime time before submitting it to the iTunes store, though an exact timeline hasn’t been given.
Apple aficionados can breathe a sigh of relief knowing that help is definitely heading their direction, as it’s now only just a matter of when. If there’s still any doubt, Google did say that its goal is “to make Google Maps available to everyone who wants to use it, regardless of the device, browser, or operating system”. http://www.androidauthority.com/google-maps-app-inches-closer-release-apple-ios-132034/
Patrick Pichette 1:53 pm on November 17, 2012 Permalink | Log in to Reply
From the topics that were explored, I think that Voice, Touch, and Gesture has the highest investor and educational potential in the short term. The technology has yet to make its way into classrooms on a widespread basis, but its ability to equalize learning opportunities across all age, geographical, socio-economic, and racial ranges gives it the greatest chance to give education a facelift. Since the technology can also be interlinked with apps, the cloud, augmented reality, personalized learning, BYOD, and digital textbooks, it has a higher level of penetration into education from many different angles. I see the greatest potential in personalized learning tools and will be exploring ideas in this field that make use of voice, touch, and gesture technology.
As for long term potential, I think augmented reality may have a chance in this area as it isn’t quite ready for adoption in an educational market but should likely heavily penetrate the consumer markets in the next few years. My fear is that this technology will likely be boom or bust in many cases so it’s difficult to gauge the area to invest in that could lead to a successful venture. From an investor’s perspective, this could definitely be a high risk, high reward area to explore.
lullings 4:15 pm on November 15, 2012 Permalink | Log in to Reply
Personally, from an investor’s perspective, voice, touch and gesture would be the most appealing (I hope I am not just favoring this week!).
The main reasons for this are that it’s new, adaptive and, most of all, can be brought into a multitude of different environments (games, education, health, etc.). This would be essential for creating a large customer base and would allow for cross-learning, which would continue to improve the product/service.
visramn 8:10 pm on November 15, 2012 Permalink | Log in to Reply
I have to say I was blown away by the different technologies out there. It is amazing how everyday objects in our homes, such as mirrors and windows, can be turned into technologically functioning devices. I found all the videos so interesting and am excited to see these technologies surfacing all around me. I personally think this era of innovation is amazing and way beyond my imagination.
Nureen
teacherben 8:22 pm on November 15, 2012 Permalink | Log in to Reply
If you have a chance and live in a big city, look out for any tech exhibitions. They are always a lot of fun. I went to one in Hong Kong a month ago with something like 10,000 vendors showcasing all sorts of crazy stuff. I pretended that I had my own company selling educational hardware and software, discussed purchases in the thousands of units, and got estimates for shipping and all sorts of things. It was a laugh and definitely an educational experience. (If you are prepared to purchase a minimum of 1,000 units, you can buy a Chinese-made tablet with specs similar to the new Nexus 7 for under US$70 per unit.)
kstackhouse 3:38 pm on November 16, 2012 Permalink | Log in to Reply
A few thoughts on each of the topics….While many of the videos showed cool and neat things I kept thinking…How lazy are we going to become? We complain now about having to get up to change the channel, then we complain about not being able to find the remote, so we create a voice activated system or a gesture system…Next we will have eye scanners so the TV can respond to our eye movement…Just Kidding, unless someone thinks this is a good idea and we can make some money from it…If that is the case, let’s talk. 🙂
I just wonder where it stops; maybe it doesn’t. I personally would not want the interactive goggles to look around. I like looking at nature and taking it in without seeing it through a screen or having AR info floating around.
I can see that the windows or table-top monitors could be something that people will really go after. From an educational standpoint, I think that these systems will be great collaboration and sharing tools. I think there are some exciting possibilities here. I also think the DIY movement is the best way to take advantage of Connectivist and Constructivist learning approaches. I could see a cross-curricular project where the Environmental Science, Biology, and Computing courses work together to solve issues like energy, and create systems and models to try to develop understanding and solutions for the issues tackled.
teacherben 4:35 pm on November 16, 2012 Permalink | Log in to Reply
Surprise! We already have TVs that you control with your eyes:
http://www.bbc.co.uk/news/technology-19441860
teacherben 4:56 pm on November 16, 2012 Permalink | Log in to Reply
My dream is to have glasses that can zoom in. My eyesight is pretty poor and I live in cities where I am always surrounded by things that I just can’t make out. If I could just make a little gesture and my glasses would zoom in so I could make it out, that would be awesome.
Apparently DARPA is already working on such a thing. Funny how so many of these ideas start with the military:
http://www.theregister.co.uk/2010/12/23/darpa_computational_cameras/
kstackhouse 10:25 pm on November 16, 2012 Permalink | Log in to Reply
Thanks for the links. That is hilarious…I guess I will have to think of another scheme to get rich. 🙂
sophiabb 3:24 pm on November 17, 2012 Permalink | Log in to Reply
Yes, that would be great. My eyesight is very poor and getting worse by the second from all this computer usage. There would be no need for progressives, which seem to be hit and miss.
jenbarker 12:47 pm on November 17, 2012 Permalink | Log in to Reply
Ken ~ I laughed when I read your post. I am one of those lazy people who would love to be able to command any light switch in my house to turn off with my voice. I don’t consider myself a lazy person by nature, but I think that when I choose to sit down and relax, technology such as this affords me the ability to truly relax. In a house with five people our remote continually disappears. I think being able to control the television with our voices would solve an ongoing problem. I am not sure if this makes us lazier or if it simply solves a problem.
We built our house three years ago and remote technology like this existed for light switches, thermostats, and stereos, but it was very expensive at the time. I see an environmental benefit to these technologies. One example would be that I could control the heat of my home while on vacation: 24 hours before arriving home I could turn the heat up, having kept it down for the week prior. This would save energy and money.
Thanks Ben for opening my eyes to all the great possibilities arising. You appear to be on the cutting edge of what is available in technology.
melissaayers 3:25 pm on November 18, 2012 Permalink | Log in to Reply
Ken, I like to think of it as being efficient, not lazy, when I see how cool many of these technologies are and want to integrate them into my daily life 🙂
jhodi 8:32 pm on November 16, 2012 Permalink | Log in to Reply
I would love to see an app used to create generative art, like the one shown in one of the videos, that could then be used in a math class to explore scale drawings: you could use a phone to zoom in and out and explore the scale used each time.
Jhodi
teacherben 2:09 am on November 17, 2012 Permalink | Log in to Reply
There are tons and tons. Processing (which was used in the video) is a language that is frequently used to create generative art, and there are sites where people have uploaded their projects so you can have a play. Try openprocessing.org or Studio Sketchpad.
Here’s an amazing one that a guy I know created:
http://www.openprocessing.org/sketch/64760
(He has created over 100 tutorial videos to get you started: funprogramming.org)
If you have an Android phone or tablet, then you can create your own apps from other people’s work and run it on your device. Here’s how:
1. Install Processing on your computer.
2. Install the Android SDK on your computer.
3. Find a Processing sketch that you like on one of those sites and copy/paste the code into Processing.
4. Plug in your Android device.
5. Click the button that says ‘Android’, then ‘Run on Device’, and it will install. Presto!
Processing is a great way to teach about generative art since it is so easy to learn. You could have kids creating awesome visuals in a couple of lessons.
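Just to give a sense of how little code that takes, here is a minimal generative sketch of my own (a quick illustrative example, not one of the projects from the sites above): a random walker that leaves slowly fading trails. Paste it into Processing and hit Run.

// A tiny generative-art sketch: a random walker leaving fading trails.
float x, y;

void setup() {
  size(600, 400);
  background(0);
  x = width / 2;
  y = height / 2;
}

void draw() {
  // Draw a translucent black rectangle so older strokes slowly fade out.
  noStroke();
  fill(0, 10);
  rect(0, 0, width, height);

  // Take a small random step, staying inside the window.
  x = constrain(x + random(-10, 10), 0, width);
  y = constrain(y + random(-10, 10), 0, height);

  // Draw a soft blue-ish dot at the new position.
  fill(random(100, 255), random(100, 255), 255, 150);
  ellipse(x, y, 12, 12);
}

Tweak the step size, colours or shapes and you get a completely different piece, which is a nice way to let kids experiment.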
Have fun!
cunnian 1:16 pm on November 17, 2012 Permalink | Log in to Reply
I must say that I was incredibly impressed with the Makey Makey toolkit. I hadn’t seen this before and initially had difficulty thinking of how I could apply this in my classroom largely because what it affords is so new to me. After some consideration, I came up with an idea to use this in my IB technology 8 class for students to design and create a new video game controller for the video games that they will already be creating using Scratch. The advantage is that they can create an input device that best suits the game that they will create. This will require them to think more deeply during the design cycle about the game that they create and how the user will interact with it. Fun idea, but not cross-curricular.
Another idea that would be cross-curricular (but completely impractical) is to use voice-activated robotics in my French class as a means of improving pronunciation. Students could identify, practice and input terms that they find hard to pronounce and then determine if a francophone could operate their robot. This would be a more engaging way of having students practice and apply their learning, with feedback delivered in a novel way.
A final and, admittedly, still somewhat impractical use of this would be with my student rowing club. The blades (oars) could be rigged up as input devices and interfaced with a drawing program on a mobile device. As they row, a pattern should emerge on the screen which they could then interpret to determine if they are pulling in the most efficient way.
At any rate, these are some ideas that immediately came to mind. This is definitely a cool technology that I will have to get my hands on!
teacherben 6:25 am on November 18, 2012 Permalink | Log in to Reply
Your MaKey MaKey/Scratch idea is great and totally doable. You could also do it with a Scratch Board/PicoBoard, which was designed specifically to work with Scratch, or with the MaKey MaKey. I bet that your kids would surprise you with how many ingenious ways they come up with to use it to input a ‘key press’.

I would love to see the rowing one. It would be a bit trickier, but it could be done. I went to a workshop a couple of weeks ago with one of the guys from the original Scratch development team and we played with MaKey MaKeys for a while. That was my first time using one. Another teacher and I built a system that attached to his glasses so we could see how long he had been wearing them. We used Scratch as well, and I figure that your blades in the water could be measured the same way: the program just counts up as long as the key is pressed and stores that in a variable. Cool. Thanks for sharing.

Both the MaKey MaKey and the Scratch board cost about 50 bucks a pop, which is a bit steep, although I’m sure you can get your money’s worth out of it. Apparently the newest Arduino model, called the Leonardo, can also be recognized as a keyboard when you connect it, so it can do basically the same thing (and a whole lot more) and those only go for 25 bucks apiece. If you want to give it a try, keep in touch; I have a grade 7 class doing Scratch projects right now and will probably try to do something similar.
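If anyone wants to see what that counting logic looks like outside of Scratch, here is a rough Processing equivalent (just a sketch of the idea, not what we actually built at the workshop). Since the MaKey MaKey shows up as an ordinary keyboard, the program simply measures how long the space bar, standing in for the blade-in-water contact, is held down:

// Rough Processing equivalent of the Scratch timing idea:
// the MaKey MaKey registers as a keyboard, so we measure how long
// the space bar (i.e. the blade contact) stays pressed.
int pressStart = 0;     // millis() reading when the key went down
int totalPressed = 0;   // accumulated contact time, in milliseconds
boolean down = false;   // is the key currently held?

void setup() {
  size(400, 200);
  textSize(20);
  fill(0);
}

void draw() {
  background(255);
  int current = down ? millis() - pressStart : 0;
  text("Total contact time: " + (totalPressed + current) + " ms", 20, 100);
}

void keyPressed() {
  if (key == ' ' && !down) {
    down = true;
    pressStart = millis();
  }
}

void keyReleased() {
  if (key == ' ' && down) {
    down = false;
    totalPressed += millis() - pressStart;
  }
}

From there it would just be a matter of mapping that variable onto something visual, which is basically what your drawing-program idea does.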
sophiabb 3:20 pm on November 17, 2012 Permalink | Log in to Reply
Whew! I am really impressed by all the technologies presented. Why am I feeling obsolete – both as an educator and a parent?
teacherben 6:02 pm on November 20, 2012 Permalink | Log in to Reply
I just found a cool project. A group of hackers has been working together to help a well-known graffiti artist who has ALS and is now completely paralysed from head to toe. All that he can move is his eyes. So they hacked a Sony PlayStation 3 and used parts from it to make something they call the EyeWriter (http://www.eyewriter.org/) that allows him to paint with his eyes. Amazing what people can do.