Monthly Archives: November 2009

Apache Wookie and IMS Basic Learning Tools Interoperability

Wookie, Basic LTI, and Blackboard Oh My!

As the IMS Learning Tools Interoperability specification begins to ship in, or become available as a plugin for, Blackboard 8 and 9, WebCT Vista, Sakai, Desire2Learn, Moodle, and ANGEL Learning, the inevitable question is "Where will the tools come from?"
I personally have always seen the tools coming from three sources:

  • Companies that market add-ons to LMS systems such as Wimba, Learning Objects, Icodeon, Jenzabar, QuestionMark, Turnitin, and many others.
  • Publishers such as Pearson and McGraw-Hill also will take advantage of LTI to produce IMS Common Cartridges that include protected content and other services hosted on their servers.
  • Teachers, students, or others will write their own tools for their own learning purposes.


Wookie, Basic LTI, and D2L Oh My!

I have always been fascinated with the third option because I think that it is where the enterprise LMS will start to shift from "Management" towards "Learning". These new teacher-driven tools will be about learning – not about management. Often these teacher-driven tools will be small and have very specific purposes such as a poll, simulation, game, or specialized interactive content.
The Apache Wookie effort (http://incubator.apache.org/wookie/) is building on the W3C widget spec in a way that makes it possible to develop and host simple learning Widgets. Scott Wilson of JISC is one of the Wookie project contributors. Scott added support to Wookie which makes it possible to create and access a widget instance using Basic LTI.
Widgets are small well-defined units of functionality that can be hosted by any widget server such as Apache Shindig and can be used directly in personal portals, desktops, or other environments in addition to LMS systems. Widgets are also evolving as a way to build portable applications across cellular telephone platforms.
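
To make this concrete, here is a minimal sketch of a W3C widget descriptor (the config.xml packaged with a widget). The element names follow the W3C widget packaging specification, but the id, name, and content file are made-up values, and a widget deployed to Wookie may need more than this:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal, hypothetical W3C widget descriptor (config.xml) -->
<widget xmlns="http://www.w3.org/ns/widgets"
        id="http://example.org/widgets/simple-chat"
        version="1.0">
  <name>Simple Chat</name>
  <description>A small chat widget for course discussions.</description>
  <content src="index.html"/>
</widget>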

Once Scott built his pre-alpha version of Wookie support for Basic LTI, it was a simple matter to embed a W3C widget into Blackboard, Desire2Learn, WebCT Vista, and Sakai. The Blackboard and WebCT support comes from a Building Block and a PowerLink built by Stephen Vickers of Edinburgh University. Since these platforms already have tools that support Basic LTI, it was a simple matter of plugging in URLs, keys, and secrets to produce the four screenshots along the right side of this post, where the TENCompetence-developed chat widget is served from Wookie and placed in each of the LMS systems.
I am very excited to be starting to feel a slow and gentle shift in IMS Basic Learning Tools Interoperability – from focusing nearly exclusively on getting vendors to support the specification to starting to think about how we teachers will ultimately make use of it. As usual, Scott Wilson of JISC/CETIS, as well as Rob Koper and the other folks at the Open University of the Netherlands, are right there at the forefront of the movement. I am very excited to be at the starting point as this trend takes off.

Wookie, Basic LTI, and WebCT Oh My!

If you too are interested in this trend, you might want to consider coming to the JISC Developer Days – February 24-27, 2010 in London. I expect that I will be there and Scott will be there as well – given the gathering potential and the many threads of thinking coming together at this time, I think that this meeting will be one not to miss.

Moodle Implementation of Basic LTI Consumer is Underway!

Marc Alier and Jordi Piguiem-Poch have started writing the code for the Moodle Basic LTI Consumer.
You can follow along at this Google code site:
http://code.google.com/p/basiclti4moodle/
You can also see Marc's LTI video from last year's IMS Learning Impact here:

This clever video explains why it is so important that LMS vendors implement IMS LTI as soon as possible!

Basic LTI and OAuth in PERL (Snippet)

David Pang came up to the Duderstadt Center this afternoon and we hacked up the beginnings of a Perl Basic LTI Tool Producer. This is the "hello world\n" version :)

#!/usr/bin/perl
use Net::OAuth;
$Net::OAuth::PROTOCOL_VERSION = Net::OAuth::PROTOCOL_VERSION_1_0A;
use CGI;
my $q = CGI->new;
print "Content-type: text/html\n\n";
# Rebuild the signed launch from the POSTed form parameters. The
# consumer_secret is a placeholder; request_url must be the URL the
# consumer actually signed ($q->url() works for a simple CGI).
my $request = Net::OAuth->request("request token")->from_hash({$q->Vars},
    request_url     => $q->url(),
    request_method  => 'POST',
    consumer_secret => 'secret');
print "Hello world\n" if $request->verify();
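
For completeness, here is a hedged sketch of the other half of the conversation – a Tool Consumer signing an outgoing launch. It follows the Net::OAuth synopsis; the key, secret, URL, and launch values below are placeholders rather than anything from a real LMS, and the LTI parameter names come from the Basic LTI drafts:

#!/usr/bin/perl
# Sign a Basic LTI launch the way a Tool Consumer would. All of the
# values below (key, secret, URL, resource/user data) are placeholders.
use Net::OAuth;
$Net::OAuth::PROTOCOL_VERSION = Net::OAuth::PROTOCOL_VERSION_1_0A;
my $request = Net::OAuth->request("request token")->new(
    consumer_key     => 'lmsng.school.edu',
    consumer_secret  => 'secret',
    request_url      => 'http://localhost/cgi-bin/blti.pl',
    request_method   => 'POST',
    signature_method => 'HMAC-SHA1',
    timestamp        => time(),
    nonce            => int(rand(2**32)),
    callback         => 'about:blank',
    extra_params     => {
        lti_message_type => 'basic-lti-launch-request',
        lti_version      => 'LTI-1p0',
        resource_link_id => '120988f929-274612',
        user_id          => '292832126',
        roles            => 'Instructor',
    },
);
$request->sign;
# These name=value pairs are what the browser ends up POSTing to the tool.
print $request->to_post_body, "\n";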

High Performance Computing Book – Republished on Connexions

Thanks to my publisher (O'Reilly and Associates), my editor Mike Loukides, and the Connexions team (Daniel Williamson, Kathi Fletcher, and Jan Odegard), my "High Performance Computing" book has been republished on Connexions (www.cnx.org). The book is freely available on the web and in PDF, and is printable on demand for the cost of printing and shipping. The entire contents of the book are under a CC-BY license, so the book can be remixed and republished by anyone who wants to adapt it.
I like to think that this is an example of how the long-term lifecycle of a book should work. Authors and publishers invest resources to make a lovely book and they get to reap some benefit from the book for a limited time; then the mainstream value of the book drops to the point where it is no longer commercially viable, and the book goes into the "public domain" to allow an infinite number of long-tail uses of the material for as long as the material is useful.

Here is the Book’s new home at Connexions: http://cnx.org/content/m32709/latest/

Here is a press-release about the book from Connexions: http://www.hpcwire.com/offthewire/HPC-Open-Education-Cup-Winners-Announced-70036597.html

Here is the introduction to the Connexions edition of the book:

The purpose of this book has always been to teach new programmers and scientists about the basics of High Performance Computing. Too many parallel and high performance computing books focus on the architecture, theory, and computer science surrounding HPC. I wanted this book to speak to the practicing Chemistry student, Physicist, or Biologist who needs to write and run programs as part of their research. I was using the first edition of the book, written by Kevin Dowd in 1996, when I found out that the book was going out of print. I immediately sent an angry letter to O'Reilly customer support imploring them to keep the book going, as it was the only book of its kind in the marketplace. That complaint letter triggered several conversations which led to me becoming the author of the second edition. In true "open-source" fashion – since I complained about it – I got to fix it. During Fall 1997, while I was using the book to teach my HPC course, I re-wrote the book one chapter at a time, fueled by multiple late-night lattes and the fear of not having anything ready for the week's lecture.
The second edition came out in July 1998, and was pretty well received. I got many good comments from teachers and scientists who felt that the book did a good job of teaching the practitioner – which made me very happy.

In 1998, this book was published at a crossroads in the history of High Performance Computing. In the late 1990's there was still a question as to whether the large vector supercomputers with their specialized memory systems could resist the assault from the increasing clock rates of the microprocessors. Also in the late 1990's there was a question as to whether the fast, expensive, and power-hungry RISC architectures would win out over the commodity Intel microprocessors and commodity memory technologies.

By 2003, the market had decided that the commodity microprocessor was king, as its performance and the performance of commodity memory subsystems kept increasing so rapidly. By 2006, the Intel architecture had eliminated all of the RISC architecture processors by greatly increasing clock rate and truly winning the increasingly important Floating Point Operations per Watt competition. Once users figured out how to effectively use loosely coupled processors, the overall cost and improving energy consumption of commodity microprocessors became the overriding factors in the marketplace.

These changes led to the book becoming less and less relevant to the common use cases in the HPC field, and led to the book going out of print – much to the chagrin of its small but devoted fan base. I was reduced to buying used copies of the book from Amazon in order to have a few copies lying around the office to give as gifts to unsuspecting visitors.

Thanks to the forward-looking approach of O'Reilly and Associates in using the Founders' Copyright and releasing out-of-print books under Creative Commons Attribution, this book once again rises from the ashes like the proverbial Phoenix. By bringing this book to Connexions and publishing it under a Creative Commons Attribution license, we are ensuring that the book is never again obsolete. We can take the core elements of the book which are still relevant, and a new community of authors can add to and adapt the book as needed over time.
Publishing through Connexions also keeps the cost of printed books very low, so it will be a wise choice as a textbook for college courses in High Performance Computing. The Creative Commons licensing and the ability to print locally can make this book available in any country and any school in the world. Like Wikipedia, those of us who use the book can become the volunteers who will help improve the book and become its co-authors.

I need to thank Kevin Dowd who wrote the first edition and graciously let me alter it from cover to cover in the second edition. Mike Loukides of O’Reilly was the editor of both the first and second editions and we talk from time to time about a possible future edition of the book. Mike was also instrumental in helping to release the book from O’Reilly under Creative Commons Attribution. The team at Connexions has been wonderful to work with. We share a passion for High Performance Computing and new forms of publishing so that the knowledge reaches as many people as possible. I want to thank Jan Odegard and Kathi Fletcher for encouraging, supporting and helping me through the re-publishing process. Daniel Williamson did an amazing job of converting the materials from the O’Reilly formats to the Connexions formats.
I truly look forward to seeing how far this book will go now that we can have an unlimited number of co-authors to invest in and then use the book. I look forward to working with you all.

Charles Severance – November 12, 2009

Standards Experience – IMS Basic Learning Tools Interoperability

I have been working on tools interoperability for learning management systems in one form or another since 2003. One of the first deliverables for the Sakai Project in 2004 was a document called the “Tool Portability Profile” (TPP) – this mythical document was going to explain how one would write a tool to plug into a learning management system. When we proposed this notion, the idea was to make it so that some developer would get a copy of the TPP and they would be off and running writing their tool to extend Sakai with no further training required.

Part-way through 2004, through some interaction between the Sakai Project Board, Blackboard, and the Mellon Foundation, we formed a working group in IMS called “Tools Interoperability 1.0”. In the early meetings, I proposed a very Java-centered specification that was an approximation of the (still nonexistent) Sakai TPP. No one seemed to like it and most discussions went in circles until Chris Vento came in and suggested that we switch to web services modeled on WebCT PowerLinks. I quickly saw the benefit of his approach and immediately switched my support to Chris’ approach.

IMS Tools Interoperability 1.0 was completed and we did an awesome demo at the 2005 Alt-I-Lab meeting in Sheffield, England. We had working code from Sakai, WebCT, Blackboard, and Moodle.

Video / Pictures

IMS TI 1.0 ended up not being used in the market because it combined solving too simple a use case with technology that was too hard to use. In a sense, developing the implementations for the Sheffield demo made it pretty clear that putting IMS TI 1.0 into production was just too hard.

But the notion of coming up with a single standardized way to plug tools into learning management systems was something that was still highly valuable and something we felt that we needed to build a standard around. So we founded IMS Learning Tools Interoperability (note the new “L” character in LTI).

IMS Learning Tools Interoperability was led by Bruno Van Haetsdaele of Wimba. For its Blackboard integration, Wimba had created a very thin Building Block that effectively "remoted" the common Building Block operations, using simple REST-based web services to forward them to the Wimba servers, where Wimba had much better control and flexibility. IMS LTI was initially set up to replicate this "remote building block using REST web services" design from Wimba. In addition, we were going to look at Facebook's API and iGoogle to ensure that our interfaces would be easy enough that people would actually "like" using them. My personal goal was that it work in PHP – and not just Java and .NET.

Wimba took the lead in LTI and started building both software and the specification in the working group. Mark Ritter was hired to really push this forward. At one meeting, I saw a demo of working code that Wimba had written and I became interested enough to start building a Sakai implementation of the draft specification. By February 2008, things were really starting to get nailed down – I had taken several trips to Wimba HQ in New York City to do intense development sessions.

In March 2008, I proposed a Google Summer of Code project to get us additional resources to crank up the project and get us over the hump. About the same time, Blackboard showed the group their upcoming Proxy Tool specification and offered it to the working group as a starting point for LTI. It was quite different from the approach we had taken in the Wimba version – but it was far more mature in terms of its scope. We did not have to switch to the BB9 Proxy approach – but after some thought it was really clear that we would gain more than we would lose by switching. So we switched, and went back to the drawing board in many ways.

Pearson had also been telling the rest of IMS about their TPI integration approach, which combined a REST-based launch with IMS Enterprise-inspired roster provisioning. While we were getting close to completing the Wimba-led design for LTI, TPI had seemed too complex to bring into LTI. Once we switched to the BB9 Proxy approach, it was a great opportunity to align IMS LTI with Pearson's TPI.

During this transition, I had been awarded two students for the Google Summer of Code in 2008. But all of a sudden I no longer had a mature specification. So I quickly whipped up an informal specification which I called SimpleLTI – a mashup of the ideas in the Wimba variant of LTI and the new Blackboard BB9 Proxy specification. I effectively had to write a specification in a week so I could hand it to my new students, Katie Edwards and Jordi Piguiem-Poch, who would be writing the Sakai and Moodle implementations respectively.

I spent the summer of 2008 working with my SimpleLTI spec and my students. The rest of the LTI working group, led by Bruno Van Haetsdaele of Wimba, Lance Neumann of Blackboard, and Greg McFall of Pearson, started going through the BB9 Proxy approach, the Wimba approach, and the TPI approach and unifying them. For example, Bruno brought us the OAuth specification as a way to securely sign messages – which was great, because my SimpleLTI spec (based on the Wimba version of LTI) had a flaw, since I was not an expert on message signing approaches. Google, Twitter, and Facebook had already used OAuth to get message signing right, and we could just use it.
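
For those who have never looked under the covers, OAuth signing boils down to building a "signature base string" from the request and running HMAC-SHA1 over it with the shared secret. Here is a rough Perl sketch of that computation – simplified to assume distinct, single-valued parameters and no token, which is how a Basic LTI launch looks – just to show there is no magic in it:

#!/usr/bin/perl
# Rough sketch of the OAuth 1.0 HMAC-SHA1 signature computation,
# simplified: distinct single-valued parameters, no token secret.
use strict; use warnings;
use URI::Escape qw(uri_escape);
use Digest::HMAC_SHA1 qw(hmac_sha1);
use MIME::Base64 qw(encode_base64);

sub enc { uri_escape($_[0], '^A-Za-z0-9\-._~') }   # RFC 3986 encoding

sub oauth_signature {
    my ($method, $url, $params, $consumer_secret) = @_;
    # 1. Percent-encode the parameters and sort them into one string.
    my $normalized = join '&',
        map { enc($_) . '=' . enc($params->{$_}) } sort keys %$params;
    # 2. Build the signature base string: METHOD & URL & parameters.
    my $base = join '&', map { enc($_) } ($method, $url, $normalized);
    # 3. HMAC-SHA1 with "consumer_secret&" (empty token secret) as key.
    return encode_base64(hmac_sha1($base, enc($consumer_secret) . '&'), '');
}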

By the end of Summer 2008, SimpleLTI was a pretty good spec – it was simple, clear, and fun to use and write to. We had solid Java, PHP, and C# implementations. I had added production-ready code to Sakai for the tool consumer, Katie had built a series of nifty features turning Sakai into a SimpleLTI Tool Producer, and Jordi had built a Moodle Tool Consumer.
We also decided to do a bunch of cool demos using SimpleLTI for Educause 2008 with ANGEL Learning, Pearson, McGraw-Hill, and Microsoft. The demos were awesome – they demonstrated the potential power of the connection between IMS LTI and the IMS Common Cartridge specification. They showed a very simple and elegant way for a cartridge to reference high-quality (i.e. not free) externally hosted content – without having to include the content in the cartridge itself.
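
To give a flavor of how that fits together: the cartridge carries a small XML descriptor that points at the externally hosted tool rather than containing the content itself. The sketch below follows the IMS Basic LTI link descriptor format that was eventually published (it post-dates these demos), and the title and launch URL are made up:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical cartridge Basic LTI link; title and URL are made up -->
<cartridge_basiclti_link
    xmlns="http://www.imsglobal.org/xsd/imslticc_v1p0"
    xmlns:blti="http://www.imsglobal.org/xsd/imsbasiclti_v1p0">
  <blti:title>Chapter 5 Interactive Exercises</blti:title>
  <blti:launch_url>http://publisher.example.com/lti/chapter5</blti:launch_url>
</cartridge_basiclti_link>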

For me, I always knew that SimpleLTI was only a *temporary* specification and that at some point I needed to "kill" SimpleLTI and switch my efforts to the "Real LTI" under development by Bruno, Greg, and Lance in the working group. After the Educause demos, I pretty much decided that SimpleLTI was done and that all of my future work needed to go toward the ultimate real specification.

So I dove into the real LTI specification and I saw that it was a *lot* more complex than either the Wimba design or the SimpleLTI design. It had a lot of great technology inside of it, but it was way too big to be "fun" to implement. So my first reaction was to go through an exercise to "subset" the LTI specification, pulling out the particular elements of LTI that mapped to the scope of the SimpleLTI specification – I called this notion "Basic LTI".

As part of this exercise, I kept finding places where SimpleLTI had a different approach than LTI. Since I had been deep in SimpleLTI, I always assumed that SimpleLTI was right and LTI was wrong. Lance Neumann and Greg McFall patiently talked me through each of the conflicts and educated me about the right way to model data and model interactions.

One by one, we walked through the differences between Basic LTI and its parent specification (which we now called "Full LTI" in the working group), and we ended up with a very nice architecture and a spec that was easy to implement.

After I had taken all of the input and developed what I figured was the final version of Basic LTI, we then pushed all of the concepts back into LTI so that Basic LTI would be a perfect subset of LTI when LTI was finally released. Since I am impatient, I am always ready to ship when something is "good enough" – but with Lance and Greg, "good enough" was simply not good enough. It had to be elegant. And they made Basic LTI elegant.

Now that Basic LTI is an internal draft in the IMS Global Learning Consortium, and members are quickly building compliant and interoperable implementations and opening up a new market for interoperable tools hosted outside of LMS systems, I just want to publicly thank my friends and mentors who have helped me and taught me so much on this journey:

  • Bruno Van Haetsdaele (Wimba)
  • Greg McFall (Pearson)
  • Lance Neumann (Blackboard)

As first the industry, and then the end-user teachers and learners, benefit from the several years of dedicated engineering that went into IMS Learning Tools Interoperability, I just want to make sure to recognize and thank the people who never gave up on the idea – meeting weekly for nearly two years – to give this great gift to us all.

Abstract: Basement 414 Lecture Series

The Basement 414 team is interested in doing some educational activities in addition to the music and art that goes on in the venue. I submitted the following draft abstracts.
The Making of the Internet and World-Wide Web
Today, we take the Internet and World-Wide Web for granted as if they have been around forever. The Internet was created in the 1980's and the World-Wide-Web was created in the 1990's. Even though it is hard to imagine life today without these technologies, in a way both were almost accidents of history, and it actually took a bit of luck for these technologies to find their way out of research labs and into general use across society. In this talk we take a particular look at the moments where creative and visionary people made decisions which helped shape the Internet and World-Wide Web. We also look into the reasons why the Internet and World-Wide Web might never have happened. The talk also includes a number of short video interviews with some of the innovators of the Internet and World-Wide Web.
Open Content, Open Software, and Creative Commons
Everything that you create, such as art, music, and software, is covered by intellectual property laws. In order to protect creative works, the laws are very conservative and greatly limit any reuse of a work unless its creators indicate their intentions with respect to reuse. In this lecture we look at the various licenses from the Creative Commons, the Apache Software Foundation, and the Free Software Foundation. These licenses make it possible for creators to easily indicate their intentions with respect to their works, and they lay the foundation for people to make use of each other's materials to create new works in a legal and respectful manner. This lecture will include video interviews with the founders of the Apache Software Foundation and the Free Software Foundation.
Dating, Game Theory, and the Nash Equilibrium
Game Theory is a branch of economics that explains how people interact and make choices in competitive and cooperative situations. Game Theory can explain who dances with whom in a bar and why people find themselves in bad situations with no way out. We can use Game Theory to analyze games and find out if there is a sure way to win and if it is possible to predict your opponent's behavior to your advantage. This lecture will cover the "Nash Equilibrium", one of the foundations of Game Theory, which earned John Nash the Nobel Prize in Economics in 1994 and was featured in the 2001 movie "A Beautiful Mind". We will explain Game Theory and the Nash Equilibrium by looking at scenes from "A Beautiful Mind". There are no prerequisites for this lecture and there is no math in it.
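
For the curious, here is a tiny illustration (not part of the lecture materials) of the idea: a short program that brute-forces the pure-strategy Nash Equilibrium of the classic Prisoner's Dilemma, using the textbook payoffs measured in years of prison:

#!/usr/bin/perl
# Find the pure-strategy Nash Equilibria of the Prisoner's Dilemma by
# brute force. Payoffs are years in prison (lower is better).
use strict; use warnings;
# "C" = stay silent (cooperate with your partner), "D" = betray (defect).
my %payoff = (
    C => { C => [1, 1], D => [3, 0] },
    D => { C => [0, 3], D => [2, 2] },
);
for my $r (qw(C D)) {
    for my $c (qw(C D)) {
        # A Nash Equilibrium: neither player can do better by changing
        # only their own choice while the other's choice stays fixed.
        my $row_ok = !grep { $payoff{$_}{$c}[0] < $payoff{$r}{$c}[0] } qw(C D);
        my $col_ok = !grep { $payoff{$r}{$_}[1] < $payoff{$r}{$c}[1] } qw(C D);
        print "($r,$c) is a Nash Equilibrium\n" if $row_ok && $col_ok;
    }
}
# Prints only (D,D) - mutual betrayal - even though (C,C) is better for
# both players: the "bad situation with no way to get out".
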
Bio:
Dr. Charles Severance is a Clinical Assistant Professor in the School of Information at the University of Michigan, where he teaches courses on the Internet, technology, and programming. His research area is novel ways of using technology to enhance teaching and learning. Dr. Severance was the co-host of a television program called Internet:TCI, which was distributed nationally by TCI Cable Television during the mid-1990's. Dr. Severance was also a guest on a call-in program on WKAR-AM for many years, talking about the Internet and technology. Dr. Severance has a B.S., M.S., and Ph.D. from Michigan State University.

Sakai Basic LTI Producer is Up on Nightly

Updated November 8: The code is now in pretty good shape and ready for testing.
Thanks to Anthony and Chris, we have the latest Sakai Basic LTI Producer up and running on nightly2 for testing. This allows Sakai tools to be dropped into other LMS systems that support a Basic LTI Consumer.
This should be part of the Sakai 2.7 release, and it can be back-ported to earlier versions of Sakai if needed.
http://nightly2.sakaiproject.org:8085/imsblti/producer/sakai.rwiki
Key: lmsng.school.edu
Password: secret
You can put any Sakai tool registration ID on the end of the URL, replacing "sakai.rwiki". I still need to run through tests of lots of tools to make sure they all work – rwiki does work.
If you want to check out the producer code yourself, it is here:
https://source.sakaiproject.org/contrib/ims/basiclti/trunk/
Drop it into a 2.6 or trunk of Sakai and add the following to your sakai.properties:
imsblti.producer.enabled=true
imsblti.producer.allowedtools=sakai.announcements:sakai.singleuser:sakai.assignment.grades:blogger:sakai.dropbox:sakai.mailbox:sakai.forums:sakai.gradebook.tool:sakai.podcasts:sakai.poll:sakai.resources:sakai.schedule:sakai.samigo:sakai.rwiki
imsblti.producer.lmsng.school.edu.secret=secret
webservices.allow=.+
Please send me any bugs you find including log messages. You can see the nightly Sakai logs at this URL:
http://nightly2.sakaiproject.org/logs/tomcat-contrib
You need to refresh and scroll to the bottom to see new entries – yes – it is not a tail – but it is simple and works.
Here is the documentation for the producer: https://source.sakaiproject.org/contrib/ims/basiclti/trunk/basiclti-docs/sakai_basiclti_producer.doc
Here is a simple test plan using my PHP fake LMS:
Testing:
http://www.dr-chuck.com/ims/php-simple/lms.php
Make the URL be
http://nightly2.sakaiproject.org:8085/imsblti/producer/sakai.rwiki
Set the LMS key to be "lmsng.school.edu"
Set the LMS secret to be “secret”
Press Submit
Scroll down and Press Launch
You should see the Sakai Wiki.
You can switch sakai.rwiki to sakai.poll or whatever.
Feel free to send me E-Mail on bugs you find.