Dual Screen iOS apps and synchronization of media channels

In my model of ambient learning support, called AICHE, I have described several processes that are important for learning with ambient and embedded information technology.

The main processes I foresaw in 2009 were aggregation, enrichment, synchronisation, and framing. See the complete paper here.

Every now and then I see new apps or models popping up that support this idea, but yesterday I saw a really nice fit, and I think quite a nice interactive way of developing dual-screen apps for iOS with AirPlay for Apple TV.

Brightcove introduced a cloud-hosting service for apps that especially supports the development of dual-screen applications, a model that can recently be seen in several new hardware and software concepts. New home entertainment consoles such as the Nintendo Wii U, which comes with a tablet offering additional meta-information and controller options, or the introduction of “My Xbox Live for iPhone” for the Xbox 360, follow a comparable model: synchronise the different pieces of screen real estate and adapt the information to the functionality best supported by each device.
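On the iOS side this pattern is technically quite small: an AirPlay-connected Apple TV shows up as a second UIScreen, and you give it its own UIWindow so that the TV and the device show different content instead of plain mirroring. Here is a minimal sketch using the public UIKit external-display API (my illustration, not Brightcove's App Cloud code; `VideoViewController` is a placeholder name):

```swift
import UIKit

// Placeholder for whatever should appear on the TV screen.
final class VideoViewController: UIViewController {}

/// Minimal sketch of a dual-screen setup: the external (AirPlay) screen
/// gets a video window, the device keeps the interactive UI.
final class DualScreenCoordinator {
    private var externalWindow: UIWindow?
    private var tokens: [NSObjectProtocol] = []

    func start() {
        // An already-connected Apple TV appears as an extra UIScreen.
        if let screen = UIScreen.screens.dropFirst().first {
            attach(to: screen)
        }
        tokens.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attach(to: screen)
        })
        tokens.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow = nil   // tear down when the TV disappears
        })
    }

    private func attach(to screen: UIScreen) {
        // A second UIWindow bound to the external screen carries the
        // "TV channel"; the device window keeps the "interaction channel".
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = VideoViewController()
        window.isHidden = false
        externalWindow = window
    }
}
```

The educational example mentioned below is then just a question of which root view controller goes on which screen: the video on the TV, the questions on the tablet.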

I think this is a general trend in entertainment and mobile technologies that can be pretty useful for learning support. In the video introducing the Brightcove App Cloud for dual-display apps, educational applications are even presented as one use case. The example shows multiple-choice questions displayed on a tablet while watching TV.

http://youtu.be/A1RjXsLZ2Ik

BTW, from the AppleInsider article: “A recent survey by Razorfish and Yahoo of more than 2,000 smartphone users found that 80 percent of respondents use their mobile device while watching TV.”

Seminar on Tablets in Education

On the 16th of May I gave a keynote at the “seminar on the use of tablets in het onderwijs” (tablets in education). About 250 people from primary schools, secondary schools, and higher education attended, all interested in the use of tablets in education.
 
I gave an overview of why tablets make a difference and which characteristics you should think of when designing the use of tablets in schools. Find my presentation in dSpace for download:
 
Specht, M. (2012, 16 May). Tablet technologie in het onderwijs [Tablet technology in education]. Presentation given at the session on tablets in education at PH Limburg, Hasselt, Belgium.
 
The following presentation, by Bart Boelen (Apple Distinguished Educator), gave an overview of the Apple ecosystem and all iPad-related services, apps, and functionalities. Learn more about his activities at www.schoolbytes.be.
 
The third talk demonstrated the use of Android tablets and the Google ecosystem, such as Google Apps for Education, and how these can be used in the classroom (Wouter Bouchez, http://www.tabbled.com).
 
The fourth talk was a presentation of Windows 8 and its cross-platform capabilities. Much of the setup seems quite similar to what Apple is doing: cloud-based documents and apps, an app store, a consistent touch interface. One difference, though, is that the interface design seems to be consistent across all devices. Presented by Jurgen van Duvel, Microsoft Education.
 
The last talk asked whether kids and teenagers are actually waiting for tablets in school. This was a great and inspiring talk by Pedro De Bruyckere; see his blog post at http://xyofeinstein.wordpress.com/2012/05/16/tablets-in-het-onderwijs-mijn-presentatie-voor-de-studienamiddag-van-phl/.
Pedro is a great speaker who keeps a fine balance between a critical perspective on technology and constructive, inspiring recommendations for its use.

TEL Dictionary: Adaptive Learning Environment

Thanks to Nicolas Balacheff, who chairs the STELLAR TEL Dictionary, I have contributed a new entry to the TEL Dictionary about Adaptive Learning Environments.

In the entry I give a high-level view on Adaptive Learning Environments and their components. I did quite some research on this topic, which actually started with my doctoral thesis at the University of Trier.

I started the thesis in 1995 and worked very intensively on the topic for about three years, mainly implementing systems and doing experiments on adaptive hypermedia approaches. During that time Peter Brusilovsky was a visiting scholar in Trier; I enjoyed this period very much, and I think we were a very intense working group on the topic.

I would also say that this period had quite some impact on the research on, and the market for, adaptive educational systems.

Peter Brusilovsky is of course “Mister Adaptive Hypermedia” and is still doing lots of interesting research in this area. My doctoral supervisor, Prof. Gerhard Weber, developed the system we had been working on (based on CL-HTTP, the Common Lisp Hypermedia Server from MIT) into a very powerful adaptive learning environment with an authoring tool, which is marketed as NetCoach, a commercial learning solution. Gerhard Weber also provides lots of very good online courses and materials on the PH Freiburg learning server. And the grandfather of all these systems, the LISP tutorial, is still online, so if you want to get into AI and learn some Lisp, this is the place to go ;-).

In my research there I explored issues of adaptive annotation of hypertexts, adaptive recommender systems, adaptive user interfaces, and others. I implemented prototypes for adaptive task sequencing, adaptive task selection, and user models with different algorithms for modelling user knowledge. Interestingly, much of the personalisation and adaptation discussion still circles around the basic concepts we discussed in those days, although of course there are new ways to aggregate and collect data about users, environments, and other sources for adaptation.
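To make these basic concepts concrete, here is a minimal sketch of classic adaptive link annotation driven by an overlay user model, in the spirit of those systems (purely illustrative, with made-up thresholds and update rules; not the actual ACE or NetCoach code):

```swift
/// Sketch of adaptive link annotation: an overlay user model keeps an
/// estimated knowledge level per concept, and links are annotated as
/// known / ready / not-ready based on prerequisite knowledge.
struct Concept {
    let id: String
    let prerequisites: [String]   // concept ids that should be known first
}

enum Annotation { case known, ready, notReady }

struct OverlayUserModel {
    private var knowledge: [String: Double] = [:]  // concept id -> 0...1

    mutating func observe(conceptId: String, success: Bool) {
        // Naive update rule, purely illustrative.
        let k = knowledge[conceptId, default: 0]
        knowledge[conceptId] = success ? min(1, k + 0.3) : max(0, k - 0.1)
    }

    func annotation(for concept: Concept) -> Annotation {
        if knowledge[concept.id, default: 0] >= 0.8 { return .known }
        let prereqsMet = concept.prerequisites.allSatisfy {
            knowledge[$0, default: 0] >= 0.5
        }
        return prereqsMet ? .ready : .notReady
    }
}
```

The real systems were of course far richer than this, but the basic control flow of observing the user, updating a model, and annotating the hyperspace is essentially what we experimented with.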

See Nicolas' blog post about the entry at http://theo-rifortel.blogspot.fr/2012/05/adaptive-learning-environment-new-entry.html.

Some of my main papers about this work in adaptive hypermedia are:

Specht, M., & Kobsa, A. (1999). Interaction of domain expertise and interface design in adaptive educational hypermedia. Proceedings of the Second Workshop on Adaptive Systems and User Modeling on the World Wide Web at WWW-8, Toronto, Canada, and UM99, Banff, Canada, 89–93.

Specht, M., & Oppermann, R. (1998). ACE – Adaptive Courseware Environment. The New Review of Hypermedia and Multimedia, 4(1), 141–161.

Specht, M. (1998). Adaptive Methoden in computerbasierten Lehr/Lernsystemen [Adaptive methods in computer-based teaching/learning systems]. GMD Research Series, 98(24). Sankt Augustin: GMD. 149 pages, ISBN 3-88457-348-9.

Schoech, V., Specht, M., & Weber, G. (1998). ADI – An empirical evaluation of a pedagogical agent. Proceedings of the World Conference on Educational Multimedia, ED-MEDIA 98, Freiburg, Germany.

Specht, M. (1998). Empirical evaluation of adaptive annotation in hypermedia. In T. Ottmann & I. Tomek (Eds.), Proceedings of the 10th World Conference on Educational Telecommunications, ED-MEDIA & ED-TELECOM 98, Freiburg, Germany (pp. 1327–1332). Charlottesville, VA: AACE.

Specht, M., Weber, G., & Brusilovsky, P. (1995). Episodische Benutzermodellierung zur individuellen Interfaceadaptation [Episodic user modelling for individual interface adaptation]. Workshop Adaptivität und Benutzermodellierung in Interaktiven Softwaresystemen. München: Siemens AG.

Matchmaking TopSectoren Media & ICT

Yesterday I participated in the Media and ICT matchmaking meeting in Hilversum. I did four pitches and, as the organisers told me, I was the busiest pitcher ;-). In the pitches you could not use any slides and had 2 minutes to present and 3 minutes to answer questions.
The areas for the pitches and my short stories were:
 
  • Business Innovation Media and ICT: The Open University is busy with a new kind of educational model and the business model behind it. We have to think out of the box to make the OER models happen, but there are more and more success examples (Flat World, Coursera with Stanford courses of 200,000 participants, online MOOCs, …). I am really curious to get this off the ground in the Netherlands, so my pitch was about finding the best partners to launch new initiatives for HE and CPD based on OER and online learning networks.
  • Design thinking: I started off with the long history of CELSTEC/OTEC in instructional design and new developments in the orchestration of learning. There are new needs and new ways of designing learning in an always-on, always-available, always-informed world. We look at real-time data streams from sensors and build artefacts and media representations that help users/learners to understand their environments and their own learning processes. We look for partners in this endeavour, as well as for companies that want to apply our research results to create new environments for performance support, active learning, and curiosity.
  • Big Data: I started with my history in user modelling, learner modelling, and adaptive systems. The OU has a long history of personal learning support via online media. Nowadays the game has changed: with social media you have lots of data. In our focus topic Learning Analytics we aggregate all these data sources and do research on how to mirror these data back to learners, so as to understand how we can support personal learning, reflection, performance support, and individual development. In my pitch I called for partners to implement systems based on our models and technologies in LA in their daily practice. Furthermore, I pitched our activity on shared research data sets based on the dataTEL project.
  • Smart and Social Media: I talked about some of our experiences with learning media and metadata for media, dropping numbers such as: mobile gaming was about an 11 billion dollar business in 2011, and one forecast puts mobile learning at a 6.8 billion EUR business in 2015. I offered opportunities for cooperation in research, laboratories, and education around new media. Based on current research programmes on learning networks and learning media, CELSTEC adds value in understanding how to use serious games, social media, and mobile media for learning and professional development.
Comments welcome!

ARLearn published on Google Play

Today I published ARLearn on Google Play, formerly known as the Android Market. There are still some glitches in the tool, but we decided to give this a try and start collecting feedback. So this is a warm invitation to send us your ideas or feature requests.

For those who are new to this topic: ARLearn is a toolkit that combines field trips, augmented reality, and serious games. With the toolkit one can create “games” (e.g. a simple field trip) and “runs”. A game is a blueprint that captures the design of your mobile activities; a game can be materialised into many runs. Within a run, a fixed set of users can act and even compete, and the actions performed by users, the responses they gave, etc. are all collected within that run.
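A rough sketch of this game/run distinction as a data model (purely illustrative; the type and field names are my assumptions, not ARLearn's actual schema or API):

```swift
import Foundation

/// A game is a reusable blueprint: the design of the mobile activity.
struct Game {
    let id: UUID
    var title: String
    var items: [String]          // e.g. questions, AR markers, waypoints
}

/// A run materialises a game for a fixed set of users and
/// accumulates everything they do.
struct Run {
    let id: UUID
    let gameId: UUID             // which blueprint this run instantiates
    var participants: [String]
    var events: [Event] = []     // actions and responses are logged here
}

/// One logged action or response within a run.
struct Event {
    let user: String
    let itemIndex: Int
    let response: String
    let timestamp: Date
}
```

The point of the split is that one design effort (the game) can serve many groups of learners (the runs), each with its own isolated results.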

The tool that we released today is fully functional; however, we have not released the authoring tool yet. So if you want to be on the bleeding edge, go ahead and download it, drop us a note, and we will help you get your first run installed. If you see yourself more as an early adopter, we advise you to wait a few more days. We are now working hard on finishing the beta version of the authoring tool for ARLearn. With this tool it will become possible to create your own field trips, play them with students, and collect the results.

Read more about ARLearn.

Download the application on Google Play.

SIKS course on mobile, ubiquitous, and contextual learning

Last Thursday and Friday we co-organised a SIKS course on mobile, personal, and contextualised learning: the “Advanced SIKS course on Technology-Enhanced Learning”.

In that context I gave an introduction to mobile, ubiquitous, and contextual learning, and highlighted some main aspects of mobility and context and why these issues are important for learning.

The slides are available at http://dspace.ou.nl/handle/1820/4227

After the introduction we ran the “Invent your App” session and got quite interesting results in the form of new app ideas. The students presented their apps as if they were planning a new company with a highly innovative product. It was quite fun!

Augmented Reality eLearning 2012 talk

Today I gave an expert session on mobile augmented reality (MAR) in education (“mobile augmented reality in het onderwijs”) at the eLearning congress 2012 in the Brabanthallen 1931. Background information on the event is on the webpage http://www.e-learningevent.nl/e-blog. The session was full and I got some positive feedback on it. Basically I presented some patterns for using MAR, from shared perspectives on 3D objects to collaborative note sharing attached to physical objects. The more I talk about AR and learning, the more I think it is linked to the synchronisation process in AICHE, i.e. the key question for the successful use of AR and MAR is how you design the link between the digital media and the physical environment.
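To make that design question concrete: a shared note can be anchored to the physical world by a visual marker on an object or by a geo-coordinate, and choosing the anchor is exactly the synchronisation decision. A minimal sketch (my illustration, not any particular system's API):

```swift
import CoreLocation

/// Two ways of linking a digital note to the physical environment.
enum PhysicalAnchor {
    case marker(id: String)                              // e.g. a QR/image marker on an object
    case location(CLLocationCoordinate2D, radiusMeters: Double)
}

struct SharedNote {
    let author: String
    let text: String
    let anchor: PhysicalAnchor
}

/// The runtime half of the digital-physical link: which notes should
/// surface for a user at `position` who has scanned `scannedMarkers`?
func visibleNotes(_ notes: [SharedNote],
                  at position: CLLocationCoordinate2D,
                  scannedMarkers: Set<String>) -> [SharedNote] {
    notes.filter { note in
        switch note.anchor {
        case .marker(let id):
            return scannedMarkers.contains(id)
        case .location(let coord, let radius):
            let here = CLLocation(latitude: position.latitude, longitude: position.longitude)
            let there = CLLocation(latitude: coord.latitude, longitude: coord.longitude)
            return here.distance(from: there) <= radius
        }
    }
}
```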

The fair is much more focused on the business market and professional training than IPON. I came across some interesting talks and demos. iTour 360, one of the e-learning award winners, does some things quite comparable to what we do with ARLearn: it is a nice tool for building tours and delivering them on mobile devices, with some features for integrating real-time information and location-based filtering.

All information from my talk is available online at:

Mobile Technology Talk at IPON 2012 (updated)

Just some reflections on my keynote today at IPON (ICT Platform Onderwijs) in Utrecht.

First, find my slides in dSpace at:

Specht, M. (2012, 29 March). Mobiele technologie in het onderwijs [Mobile technology in education]. Presentation at IPON 2012, ICT Platform Onderwijs, Utrecht, The Netherlands.

Basically, IPON is a trade show with a variety of podia and stands where all kinds of technology and software companies offer services for the Dutch market.

I gave a talk about “mobile technology for learning”, mainly following this line of argumentation:

  1. A calculator is a mobile device that has been used in the classroom for ages.
  2. Smartphones deliver the same functionality and much more!
  3. But there is one problem …
  4. Kids do not want to use them for the things that teachers want them to do.
  5. So as a follow-up I showed several apps and some ideas on how to use them in the classroom and beyond, as the main power of mobile devices lies in the fact that they

    • are a bridge to the world of the users
    • are personal, always-with-them devices
    • can be used in and outside the classroom
    • can collect media of all sorts
    • can be contextualized (or adaptive to the situation)
  6. Some of the apps that I used:

    • iStanford, Moodle Mobile
    • Audioboo, Evernote
    • Weather, Nike+, barcoo, Wikitude
    • WhatsApp, Twitter, Facebook
    • ARLearn, Mooble

Basically, one can say that even with the quite basic tools you get out of the box, you can implement mobile and ubiquitous learning support today. In most cases users should focus on one feature to introduce at a time and understand what they can do with it.

After that I watched two other sessions. Guido van Dijk talked about Connect College and how the school developed a vision leading to its ICT implementation strategy. One result of this project is a quite impressive movie about the vision statement of Connect College, which I will post as soon as I get my hands on it.

A second session was announced as being about Bring Your Own Device, but to be honest, the conclusion of that session was: “You need a safe and stable network on which users can log in with different devices!” Yes! And of course you should purchase that from supplier XYZ.

In general there were quite some interesting sessions at the trade show. Furthermore, I have never seen so many interactive whiteboards at a trade show, although admittedly I have not been to many trade shows in recent years. In this sense it seems obvious: lots of vendors want to sell their 80″ touch screens that you can use as a table or hang on the wall. This fits quite well with our research on ambient displays and how to use them for learning. Nevertheless, most of these solutions are still really expensive, so you may ask: should a school really invest in screens of 10,000 each? Oops.

Toolkit for DIY Sensor Recording

Today I came across a toolkit for building cross-platform DIY sensor-recording apps on iOS and Android.

AntiMap http://theantimap.com/

“The AntiMap is an Open Source creative toolset for recording and visualising your own data. The project currently consists of a smart phone utility application (AntiMap Log) for data capture, and a couple of web/desktop applications (AntiMap Simple and AntiMap Video) for post analysis and data visualisation.

Our aim is to produce new and creative representations of data. If you would like to contribute or have created a visualisation you would like to share, please contact us.”

They have a nice snowboarding video:

AntiMap Video application: Unofficial snowboard edit from Trent Brooks on Vimeo.

Music: The XX – Intro (1984 remix) http://soundcloud.com/the1984/the-xx-intro-1984-remix.

My aim for this project is to aid the progression of snowboarding and skiing by gathering real-time rider data and post-analyzing it with video synchronization.

HOW IT WORKS:
I used an Android phone (HTC Sensation) with a custom built application (AntiMap Log), placed upright against my lead hip/waist inside my pants to log all the stats and information. This position is the most stable and yields the most accurate results for spinning/rotation when snowboarding. Just placing it in any of your pockets works fine for everything else except rotation as it moves around when loose.

Video was captured with a Go Pro camera. In the first segment of the video I had it attached to my helmet, and in the next segment I’m just holding it (I forgot the camera strap – idiot!). Whilst I decided to film myself for these early tests, having someone else do the filming would be ideal.

So data and video are recorded separately to keep the riding experience as unaffected as possible. Then once your pow riding day is over and you’ve recorded that perfect run, you can synchronise your video and data easily with the AntiMap Video desktop application and play it all back.

POTENTIAL USES:
– Real-time snowboard/ski games. I was originally inspired for this project by playing Shaun White Snowboarding on the Nintendo Wii.
– Making personal snow/ski movies.
– Training/tutoring tool.
– Competitions. I’d love to see technology like this used in an accompanying role at televised events such as the Winter X Games. Giving spectators a bit more insight through data and stats would be invaluable.
– Whilst I specifically built this application for snow/ski, it could just as easily be adapted to suit other sports such as mountain biking, skateboarding, parkour, gymnastics, even running or walking.

TECHNICAL DETAILS:
Data is gathered through a smartphone utility application called ‘AntiMap Log’, built in Processing. The application logs latitude, longitude, compass direction, speed, distance, and time to a standard CSV file at 30 fps. It is currently Android only; an iPhone version is under development.
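In spirit, such a logger is a fixed-rate sampling loop that appends one CSV row per tick. A minimal sketch of the idea (the real AntiMap Log is built in Processing; this Swift version is my illustration, not its code):

```swift
import Foundation
import CoreLocation

/// Illustrative 30 fps sensor-to-CSV logger. The latest location and
/// heading are assumed to be fed in by CLLocationManager delegate callbacks.
final class SensorLogger {
    private var rows: [String] = ["lat,lon,compass,speed,distance,time"]
    private var distance: Double = 0
    private var lastLocation: CLLocation?
    private var timer: Timer?

    var currentLocation: CLLocation?
    var currentHeading: Double = 0   // degrees, 0...360

    func start() {
        // Sample at ~30 fps, matching the CSV rate described above.
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0,
                                     repeats: true) { [weak self] _ in
            self?.sample()
        }
    }

    private func sample() {
        guard let loc = currentLocation else { return }
        // Accumulate travelled distance from consecutive fixes.
        if let last = lastLocation { distance += loc.distance(from: last) }
        lastLocation = loc
        rows.append(String(format: "%.6f,%.6f,%.1f,%.2f,%.1f,%.3f",
                           loc.coordinate.latitude, loc.coordinate.longitude,
                           currentHeading, max(0, loc.speed), distance,
                           loc.timestamp.timeIntervalSince1970))
    }

    func stop(writingTo url: URL) throws {
        timer?.invalidate()
        try rows.joined(separator: "\n")
            .write(to: url, atomically: true, encoding: .utf8)
    }
}
```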

The post-analysis application, ‘AntiMap Video’, is a desktop application built in openFrameworks. It allows the logged data from the mobile application to be synced with video footage (not captured with the phone). The standout feature of the AntiMap Video application is spin detection, which uses the compass data to accumulate a rotation value and attempts to determine when a 360, 540, 720, 900, or 1080 has occurred and in which direction (frontside/backside). The rider’s path and current position are graphically generated from the recorded latitude and longitude in a mini-map. Speed, distance, and time stats also update on screen.
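The spin-detection idea can be sketched compactly: accumulate the signed, wraparound-corrected difference between consecutive compass headings, then snap the total to the nearest trick size. This is my reading of the description above, not the actual openFrameworks code:

```swift
/// Minimal sketch of compass-based spin detection. Headings are in
/// degrees, sampled at a fixed rate over the duration of one jump.
func detectSpin(headings: [Double]) -> (degrees: Double, direction: String)? {
    guard headings.count > 1 else { return nil }
    var accumulated = 0.0
    for i in 1..<headings.count {
        // Smallest signed difference between consecutive headings,
        // so the 359 -> 1 wraparound counts as +2, not -358.
        var delta = headings[i] - headings[i - 1]
        if delta > 180 { delta -= 360 }
        if delta < -180 { delta += 360 }
        accumulated += delta
    }
    // Snap to the nearest trick size the paragraph above mentions.
    let tricks: [Double] = [360, 540, 720, 900, 1080]
    let magnitude = abs(accumulated)
    guard let trick = tricks.min(by: { abs($0 - magnitude) < abs($1 - magnitude) }),
          abs(trick - magnitude) < 90 else { return nil }
    // The sign gives the rotation direction; mapping it to frontside or
    // backside also depends on the rider's stance, so that label is omitted.
    return (trick, accumulated > 0 ? "clockwise" : "counter-clockwise")
}
```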

AntiMap Video is still an early working prototype at the moment, but I will continue developing and improving it before making it available for download. The application and source code will be released free under the Creative Commons Attribution-NonCommercial 3.0 license (http://creativecommons.org/licenses/by-nc/3.0/). This is the first of a few free applications I plan on releasing for the AntiMap project that visualise logged data from the mobile application.

SNOWBOARDERS & SKIERS:
I am looking for testers to help create the first official AntiMap video. I failed in my search for the perfect run at Mt Ruapehu (one bluebird day at Whakapapa with no helmet strap for the camera and no park, followed by three days of whiteout at Turoa). I just recently left New Zealand and won’t even be close to snow till at least next winter, so I need some help! In short, I just want someone to film their perfect run, landing a couple of spins off medium/large jumps, whilst running the mobile application. If anyone is interested, drop me an email at theantimap@gmail.com. All you need is an Android phone and a camera (preferably a POV helmet cam like the GoPro).

I’m a little disappointed that I was unable to find my perfect run and had to Frankenstein together videos to show different parts of the application’s functionality. But overall I’m happy with the results of the tests.

UPDATE:
Applications and source code for iPhone & Android available: http://theantimap.com/

Interesting collection of best practices for CRS

Classroom response systems are often named as a popular way to integrate mobile technology in the classroom. A comprehensive collection of best practices and reflections on educational practice can be found at:

A set of references with underpinning research is available at: