Power Outage Notes

These are my notes on how to handle a power outage when I am running my house on a generator. I have a small 2500W generator with a 40-minute gas tank and a 1000W inverter for the Prius – or for more extended situations I borrow a 7000W generator with an 18-hour gas tank.

I have built two 110V back-feed adapters by taking a cheap power strip cord, cutting it, and adding a male end plus a pulled-out loop to read amps using a PYLE Meters PCMT20 Digital AC/DC Auto-Ranging Clamp Meter. It is nice to be able to check the amp draw on each circuit as you balance loads. It is especially nice to catch those startup surges (e.g., laser printers).

Here are my notes:

Back feed outlets
– #1 Dining room to the right of the back door (light load items + Sump Pump + garage)
– #2 Laundry Room (heavy load items – furnace, HW heater, kitchen fridge, basement freezer)

What to turn off / unplug:
– Washer and dryer
– Laser Printers (these pull a massive surge when they start)
– Sump pump (110V) – leave 12V backup sump charger plugged in (on #1)
– Basement freezer (#2)
– Garage refrigerator (#1)

Note: I don’t have any UPS / major surge protectors – but I would unplug those if I had them. I don’t bother to unplug my power strips.

Small Generator (2500W) scenario:
– #1 runs from Prius with inverter – use only LED lights
– #2 runs from 2500W generator – alternate between
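The balancing in these notes comes down to simple arithmetic: watts divided by volts gives amps, and the running loads have to fit under the generator's capacity. A quick sketch of that check (the wattages below are illustrative guesses, not my measured values – use the clamp meter for real numbers):

```python
# Rough load-balancing check for the 2500W generator scenario.
# The wattages are hypothetical placeholders, NOT measured values.
GENERATOR_WATTS = 2500
VOLTS = 110

loads = {
    "furnace blower": 800,
    "HW heater controls": 100,
    "kitchen fridge": 700,
    "basement freezer": 500,
}

total = sum(loads.values())          # total running watts
amps = total / VOLTS                 # amps seen on the clamp meter
headroom = GENERATOR_WATTS - total   # watts left before overload

print(f"Total: {total}W = {amps:.1f}A; headroom: {headroom}W")
```

Remember that startup surges (fridge compressors, laser printers) can be several times the running wattage, which is why the heavy loads get alternated rather than run together.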

LED Lights:
– Cabinet top and bottom
– Island down lights (2)
– Piano top
– Living Room between sofa and love seat
– Master Bath down lights
– Office desk lamp
– Mud room

Duplicating a Mono (or Single Channel) Track into Stereo in Audacity

This is a really simple set of steps to duplicate one channel of a stereo recording to the other channel, but for the life of me I cannot ever seem to remember it, so I am writing it down so I can refer back to it.

My problem is that I use a Zoom H4N Portable Digital Recorder and a high-quality microphone to record my IEEE Computer Computing Conversations podcasts, and so I end up with a single channel in a stereo track with the other channel completely silent.

Here are the steps to duplicate the track with sound to the other track so both channels are identical:

1. From the track drop-down next to the file name, select "Split Stereo Track".

2. Select the entire upper track by double-clicking it, then Copy (i.e. Command-C).

3. Move the cursor to the very beginning of the empty track and Paste (i.e. Command-V).

4. Select both the top track and the bottom track (click on one track and Shift-click on the other, or use Command-A).

5. From the track drop-down next to the file name, select "Make Stereo Track".

Surprisingly simple, yet hard for me to remember if I only do it once a month.

This web site helped me figure it out in the first place – http://chacadwa.com/foh/mono-to-stereo-in-audacity
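For the programmatically inclined, the same duplication can be done outside Audacity by copying the mono samples into both stereo channels. A minimal sketch using NumPy (the tone and sample rate here are made up for illustration – in practice you would load the recorder's WAV file):

```python
import numpy as np

# Hypothetical mono signal: one second of a 440 Hz tone at 44.1 kHz.
rate = 44100
t = np.linspace(0, 1, rate, endpoint=False)
mono = (0.5 * np.sin(2 * np.pi * 440 * t)).astype(np.float32)

# Duplicate the single channel into both stereo channels --
# the programmatic equivalent of Split / Copy / Paste / Make Stereo Track.
stereo = np.column_stack((mono, mono))

assert stereo.shape == (rate, 2)
assert (stereo[:, 0] == stereo[:, 1]).all()
```

The `column_stack` call is all there is to it: both columns of the resulting array hold identical samples, so both channels play the same audio.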

Using SSH Tunnel to make my Laptop Appear to have a Real Internet Address

Sometimes I need to have a development version of Sakai running with a real IP address on the real Internet. For example, to run the IMS Learning Tools Interoperability certification I need to have my Sakai accept web service callbacks from the IMS server.

One way to do it is to run an Amazon EC2 instance, but this is a pain. I need to log in a bunch of times to get all the windows I want and I cannot use any desktop apps. And I need a big EC2 instance and need to install a bunch of stuff – yada yada – and I already have all that stuff on my desktop.

So here is the procedure.

First, create the smallest possible Amazon EC2 instance. Make sure to open port 8080 on your instance (in its security group). Log in to the instance and edit the sshd configuration file:

sudo vi /etc/ssh/sshd_config

Add or uncomment a line that says this:

GatewayPorts yes

Then restart sshd:

sudo /etc/init.d/ssh restart

Then log out of the server and set up your SSH tunnel (the -R option forwards connections to port 8080 on the remote server back to port 8080 on your laptop):

ssh -i .ssh/zzz.pem -R 8080:localhost:8080 ubuntu@ec2-23-22-200-200.compute-1.amazonaws.com

You can double check that your server is accepting connections from the outside world:

netstat -ntl

Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address Foreign Address State
tcp 0 0 127.0.0.1:3306 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:8080 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN
tcp6 0 0 :::22 :::* LISTEN
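The thing to look for in that output is that port 8080 is bound to 0.0.0.0 (all interfaces) rather than 127.0.0.1 – that is what GatewayPorts yes changes. Here is a sketch of checking that programmatically in Python; the sample text is the output shown above, and in real use you would capture `netstat -ntl` with subprocess instead:

```python
# Sketch: confirm a port is listening on all interfaces, not just localhost.
# `sample` stands in for real `netstat -ntl` output (e.g. captured via
# subprocess.run(["netstat", "-ntl"], capture_output=True, text=True)).
sample = """\
Proto Recv-Q Send-Q Local Address Foreign Address State
tcp 0 0 127.0.0.1:3306 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:8080 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN
"""

def listens_on_all_interfaces(netstat_text, port):
    """Return True if the port is bound to 0.0.0.0 (reachable externally)."""
    for line in netstat_text.splitlines():
        fields = line.split()
        if len(fields) >= 4 and fields[3] == f"0.0.0.0:{port}":
            return True
    return False

print(listens_on_all_interfaces(sample, 8080))  # True
print(listens_on_all_interfaces(sample, 3306))  # False (localhost only)
```

If the 8080 line shows 127.0.0.1:8080 instead, GatewayPorts did not take effect and the outside world cannot reach your tunnel.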

In Sakai, you add the following to your sakai.properties:

serverUrl=http://ec2-23-22-200-200.compute-1.amazonaws.com:8080

Once your Sakai is up you should be able to navigate to

http://ec2-23-22-200-200.compute-1.amazonaws.com:8080/portal

And voilà – everything is running on your laptop but the server is on the Internet with a real address.

Update: What you need to do for DigitalOcean

Blog Post: https://www.digitalocean.com/community/tutorials/how-to-set-up-ssh-tunneling-on-a-vps

Sakai Setting:

serverUrl=http://do1.dr-chuck.com:8080

Command to restart sshd:

service ssh restart

SSH Call:

ssh root@do1.dr-chuck.com -R 8080:localhost:8080

Procrastination, Late Days, Special Exceptions – Dog Ate My Homework

My Coursera class has started a thread about wanting extensions on assignments. I don’t have complex late-day policies – I publish a deadline and don’t move it.

Here is my post to the thread:

I will be honest and tell you that there is no policy that I can come up with that will make everyone happy. Last summer I taught the class and had late days. People would use up all the late days in the first two weeks and then threads like this would start about “we need more late days”, “the software is broken and misleading”, or “late days are a mess – can I have four more?” etc. etc. I have agonized over this a lot and this is what I conclude:

Some people procrastinate and some people do not. Procrastinating is not bad (I do it all the time). But when you procrastinate, you add risk. The only way anyone misses a deadline by 10 minutes is to have carefully calculated the latest possible time to do the quiz, and then something turned out wrong. If the student had done the work a day earlier, a time zone mixup, a small network failure, or the need to go pick the kids up from day care would not lead to a late assignment.

Those who procrastinate will take any late days, add them to the deadline, and start working right before the deadline plus late days – and then something goes wrong and they still need an extension.

People who don’t procrastinate don’t need late days. They have a little extra time built in to cover for little things in the software or important things that take priority in their life.

But I am not 100% anti-procrastination – as I said, I do it all the time. If you miss the deadline on a quiz or two, just let it go and keep up from that point on. In the previous three times I have taught the course, one or two late quizzes have *never* been the reason that folks don’t earn a certificate. Those who don’t earn a certificate usually miss 50% of the class.

At the end of the class I do several pre-calculations of the overall grades and look at those who are “close” – if it appears that a bunch of students worked hard during the entire class and missed only a few points, I will change the grading scheme a bit.

So instead of advocating for adjustments – just make sure the rest of the quizzes are done on time. I assure you things will work out.

I have learned a few things in 30 years of teaching and one of them is that if I start giving individual students exceptions, then it simply means that those students will need even more exceptions in the future. If you can tell me a story about the dog chewing on your cable modem and get an extra week and then word gets around the class – you would be surprised at how many dogs all of a sudden start eating cable modems.

So I just keep it simple with no exceptions and then take a close look at everyone before I award grades at the end of the semester.

Comments welcome.

Copyright Text for my Python Book

I just found out that the last remaining barrier to publishing my Python for Informatics book on CreateSpace has been resolved – so I am furiously preparing to publish the book.

Of course at the moment of publishing, one must decide which copyright license to use on the book.

My travails, fears, concerns, and angst around the use of CC-BY for some of my most precious bits of IP are documented here, here, and here. So I don’t want to use CC-BY. And CC-BY-SA does not fix the problem of unscrupulous print or spam-farm re-publishers, so I find myself pushed into CC-BY-NC-SA. Adding the NC gives me a better chance to force takedown of scumbag no-value-add print republishers, but it blocks certain uses that I explicitly do not want to block.

So I have added the following text to an appendix of the book in the hope that I can walk a fine path and do what I want. I am inspired by the CC0 technique of recording the author’s intentions. This stops short of the CC1 idea I proposed in an earlier blog post when I was really angry. It really is CC-BY-NC-SA in most situations and CC-BY or CC-BY-SA in some specifically listed situations. I am curious to know if it works.

So here is the text of my copyright appendix from the new book. Comments and review welcome.

Copyright Addendum

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

I would have preferred to license the book under the less restrictive CC-BY-SA license. But unfortunately there are a few unscrupulous organizations who search for and find freely licensed books, and then publish and sell virtually unchanged copies of the books on a print-on-demand service such as LuLu or CreateSpace. CreateSpace has (thankfully) added a policy that gives the wishes of the actual copyright holder preference over a non-copyright holder attempting to publish a freely-licensed work. Unfortunately there are many print-on-demand services and very few have as well-considered a policy as CreateSpace.

So I regretfully added the NC element to the license of this book to give me recourse in case someone tries to clone this book and sell it commercially. Unfortunately, adding NC limits uses of this material that I would like to permit. So I have added this section of the document to describe specific situations where I am giving my permission in advance to use the material in this book in situations that some might consider commercial.

  • If you are printing a limited number of copies of this book for use in a course, then you are granted CC-BY license to these materials for that purpose.
  • If you are willing to translate this book into a language other than English, contact me so I can grant you a CC-BY license to these materials with respect to the publication of your translation. In particular, you are permitted to sell the resulting translated book commercially.

Of course, you are welcome to contact me and ask for permission if these clauses are not sufficient. In almost all cases permission to reuse and remix this material will be granted as long as there is clear added value or benefit to students or teachers that will accrue as a result of the new work.

Charles Severance
www.dr-chuck.com
Ann Arbor, MI, USA
September 7, 2013

P.S. Many thanks to Lauren Cowles of Cambridge Press and Alan B. Downey for helping me with the IP issues with this book over the past few years.

Teaching a ROOC – Re-Mixable Open Online Course

Note: Since this post mentions Blackboard, I want to remind everyone that I do work for Blackboard as the Sakai Chief Strategist in addition to working as a faculty member for the University of Michigan. The statements in this post are my own and do not reflect any official Blackboard positions or directions.

I have been having great fun teaching my Internet History, Technology, and Security (IHTS) MOOC using Coursera. The class is now on its third iteration with about 120,000 students total across all three sections of the course and about 12,000 students earning certificates. I love how the large scale of these courses allows me to touch so many people around the world. I can hold office hours in any large city in the world and meet and interact with my students. This is a video filmed by one of my students (Nico Morrison), who captured about 15 minutes of the actual conversations that happen at a typical office hours:


London Office Hours Video (Extended edition) (13:39)

I hope more students bring cameras to my office hours – the videos I make for the office hours are more in the “hello world” style:


But for me the “Massive” in MOOC is only part of the story. Over the past decade, I have spent a lot of time pushing for Open Educational Resources (OER), and standards and tools to advance the cause of OER materials in education.

A Focus on Remixability – Python for Informatics

In order to “walk the walk” of OER and experience the real issues and challenges in remixable OER, I have another course where I am pushing on the “Remixability” dimension of Open Educational Resources. My course, based on my Python for Informatics: Exploring Information CC-BY-SA licensed textbook, is supported by CC-BY course materials and Apache-licensed software.

Here are my OER Materials for Python for Informatics

These materials include the book, all slide sets, all video lectures, and the auto-grading software. I provide the materials both in a kit of files and instructions and as course exports from Moodle, Blackboard, and IMS Common Cartridge.

An Exercise in Teaching on Multiple Platforms

As an exercise in proving that the materials are indeed reusable, I have taught my class using the materials on Moodle on Dr. Chuck Online and on Blackboard’s free hosted CourseSites platform. I also put the materials up as a Python course on Peer-to-Peer University.

Here is what the courses look like in Moodle 2.5 (with the new pretty default skin) and in Blackboard’s CourseSites:

None of these offerings has attracted massive numbers of students. I have about 3000 students enrolled across the three courses. The courses follow a self-paced model so there is less time pressure for students and less stress for me. Students complete the course and earn a badge.

My goal was to prove that I could quickly construct a course on a new system by reusing the materials. I was able to put up the CourseSites instance of my course over a couple of days and the P2PU course in about a day. The P2PU class was easier to put up because there is no auto-grading – all the grading is done by peers in P2PU.

Building a Learner Community Around Remixed Content

One of my other dreams is that we would create communities of interest around learning content. I would love to have a forum / mailing list that brought all of the people interested in using my Python for Informatics book and materials together where we could exchange ideas and remixed materials with each other.

I have not yet started to build such a community, but I am using Blackboard’s xpLor product to connect the two instances of my Python course running on Dr. Chuck Online (Moodle) and CourseSites (Blackboard Learn). Here are screen shots of that cross-LMS forum hosted in xpLor linked from each of the courses:

This forum linkage is using xpLor and IMS Learning Tools Interoperability and was the subject of my recent presentation at Blackboard’s BbWorld 2013 conference.

A Community of and For Teachers

One of my core beliefs is that for remixability to work we need places for teachers to find, exchange, remix, and publish materials. This requires both interoperability standards like IMS Common Cartridge and sites like YouTube where people can congregate.

Blackboard has put a lot of effort into Learn to support standards like IMS Common Cartridge and IMS Learning Tools Interoperability and has added nice features to CourseSites to make publishing a CourseSites course as open content. Once I had built my course in CourseSites, there was a control panel option to make my materials freely available on the web. Here is a screen shot of what it takes to publish a Blackboard backup and IMS Common Cartridge from CourseSites:

Every instructor on CourseSites has their own page. Mine is drchuck.coursesites.com and all the open content that each of us publishes is automatically made available on each course page linked from the Instructor page:

Python for Informatics on CourseSites

Scroll to the bottom of this page to see the links to the open content that was automatically produced by CourseSites.

Some Reflection

So I have this open, free and re-mixable course. Students can take the course and leave with the materials to use in their own courses. What next?

Well, the first observation is that not too much has happened. At this moment I think that the urge for teachers to remix each other’s materials is still a pre-emergent concept. It does not have a “viral factor” where people tell each other about the idea so that it experiences rapid super-linear growth (look at the stats).

I have never been too upset about working hard in an area that is still pre-emergent. At times, you do get the sense that no one cares and that no one ever will. I got that feeling a lot when I was working on IMS Learning Tools Interoperability from 2004 to 2010 – but things turned out nicely on that front. So I will just keep pushing forward wherever and however I feel I can push to advance this cause with technology and standards.

Does A Person Need to Sign a new Contribution Agreement When They Change Jobs?

It has recently come up in discussion in the Apereo Foundation Licensing discussion group whether new contribution agreements are needed when an individual gets a new job like my additional job at Blackboard or in the case where rSmart’s Sakai operation became part of Asahi Net. The discussion centers around these two documents:

Sakai Individual Contributor License Agreement 1.0.1 (i.e. ICLA) (cached copy)

Sakai Software Grant and Corporate Contributor License Agreement v1.1.1 (i.e. CCLA) (cached copy)

These documents were based on the Apache Individual Contributor License Agreement (ICLA) : (2004 version) and (current version) (there were inconsequential text changes on the Apache document between 2004 and the present – but the version on the Apache document was not changed)

Some in the discussion were pointing out that when a person changes jobs and they have an ICLA on file with the project, they do not need to sign a new ICLA. Many of us with long experience point out that this is not the point – even if some lawyer reads the words literally and informs us that there is nothing in the wording of the above documents that forces individuals or corporations to file a new ICLA or require a CCLA for their new company, it is in the best interest of the project to file new CCLA and ICLA documents.

That technical/legal argument misses the whole point. This is not about doing the minimum that is legally required – instead we should take steps to ensure that new ICLAs and a new CCLA are filed to best protect the project’s interests.

This is my more detailed response in that mail thread.


All the documents definitely mention the employer(s) and address the “employer(s) permission” in their section 4:

You represent that you are legally entitled to grant the above license. If your employer(s) has rights to intellectual property that you create that includes your Contributions, you represent that you have received permission to make Contributions on behalf of that employer, that your employer has waived such rights for your Contributions to the Foundation, or that your employer has executed a separate Corporate CLA with the Foundation.

This is a very important clause. It does not require a signature from your “boss” on the ICLA and does not require a corporate CCLA – but it requires diligence on the part of the employee to obtain permission and to keep monitoring anything that might change that permission, as requested in section 7:

You agree to notify the Foundation of any facts or circumstances of which you become aware that would make these representations inaccurate in any respect.

It is almost 100% likely that when an individual changes jobs their ICLA needs *review*, as they will have signed an IP arrangement with their new employer unless the new job is distinctly non-technical – like switching from a job doing software development to being a taxi driver.

In order for the employee to comply with section 7 of the ICLA the *most-absolutely-squeaky-clean* way to do this is to (a) have the new company file a CCLA explicitly listing the individual(s) involved and (b) have the individuals re-affirm their ICLA by re-submitting a new one. This is what I did when I joined Blackboard.

Some *lawyer* might advise us that we *technically* are not forced to follow the safe path by literally reading the words in the contracts – but I would then tell that lawyer “thanks for your advice” and follow the safe path. A wiser lawyer likely would say “you don’t have to demand a CCLA – but particularly when a technology company is involved your best approach is to get the CCLA and new ICLAs to cover all the bases if it ever came up in court…”.

If, for example, I went to work for Blackboard and asked them to sign a CCLA and they refused, then I as an individual would need to inform the foundation (under section 7) that my ICLA was no longer valid because my conditions had changed. The whole structure presupposes that the individual is carefully monitoring and protecting the foundation’s interest across changes in their employment situation.

As another scenario, if a person working for a university was reorganized into a new unit and was explicitly told by their supervisor that they were *prohibited* from working on Sakai, that would also (in my opinion) trigger section 7 and make the ICLA invalid – and so to protect the foundation the individual should stop making commits. And whether or not a rent-a-lawyer would say there are no words that insist the ICLA is invalid, an individual who continues to contribute under cloudy IP conditions puts the foundation at grave risk.

So this is about not just the letter of the contract but also an individual’s personal commitment, described in section 7 of those contracts, to protect the foundation’s interests as their job situation changes. We as individuals need to think not about the least we are legally required to do but about what is best for the long-term health of the foundation. So even if a lawyer says we don’t need a new ICLA and don’t need a CCLA when we get a new job, as individuals we should want to go beyond the technical minimum and give the foundation maximum protection.

Report from the First Apereo Conference

I really enjoyed attending the first Open Apereo 2013 conference in San Diego June 2-7, 2013.

The mood of the conference was excited, positive, and upbeat. For the past two years the Sakai and JASIG communities have been gently aligning in anticipation of Apereo. But now the waiting and anticipation is over and all the energy that went into planning is going back into our existing and new projects. It seemed that for a few years, both organizations put a lot of forward-looking thinking on hold to focus on the merger. But now nothing is on hold and it felt like new ideas and new directions for our new community were popping up every day. Being able to make progress on these pent-up project ideas was very freeing.

For me the founding notion of Apereo was that the richer and more diverse our community became, the more solid and sustainable it would be. With Apereo we can focus on anyone in higher education interested in openness (and not just software) rather than limiting our scope to our historical beginnings. It is a very freeing feeling. We are no longer bound by the (wonderful and amazing) historical accidents that brought the Sakai and JASig communities to life. We owe a great debt of gratitude to Ira Fuchs, Carl Jacobson, and Joseph Hardin as the initial founders of our projects.

But now, ten years later, we have re-founded ourselves in a careful and thoughtful manner, informed by over a decade of experience making open source happen. We need to thank those who gave so much to make this merger a reality. This was two-plus years of hard work where a number of people learned far more about non-profit laws than you could imagine. Building something good takes time – but a lot of people are very relieved to have it finished so we can look to the future.

People who stick out for me as leading the merger effort include: Patty Gertz, Ian Dolphin, Josh Baron, Jens Haeusser, Robert Sherratt, John Lewis, Seth Theriault, and both of the boards of directors of Sakai and JASIG as well as the transition committee made up of members from both boards. I was on the Sakai Board and part of the transition committee but my own contributions were small compared to those who were the real leaders of the effort. It was a long and winding road – and the only way to move forward was to be patient and do it right because we really only had one chance to get the founding principles of the Apereo Foundation right.

Sakai in an Apereo Foundation World

The Sakai-related efforts that are now part of Apereo are much better positioned to make forward progress. In the Sakai Project and Foundation, these efforts were often too intertwined to move forward independently. We spent too much time trying to come up with one set of priorities across all our efforts, which distracted from moving those efforts forward. Here are my observations:

  • The Apereo Open Academic Environment has renamed itself to emphasize that the OAE is very much an independent project exploring next-generation approaches to teaching, learning, and collaboration. The OAE team has rewritten much of the core software since the end of 2012 and is moving quickly to a version 1.0 sometime this summer, running in production for Marist, Georgia Tech, and Cambridge. Getting a 1.0 project into production is a wonderful milestone and will likely re-kindle interest in the OAE project, growing its community and resources. Some might say that OAE died and has been reborn – I disagree with this notion. OAE has been on a path all along and there were certainly bumps on that path – as the bumps smoothed out, the project has moved nicely toward a 1.0 release.
  • Teaching and Learning SIG – Because this is now an independent entity within Apereo it is a natural place to look across the Sakai CLE and OAE as well as looking at emerging efforts (below). The T/L group also will continue the TWISA (Teaching with Sakai Innovation Awards) and look to expand the effort. This group serves as a natural gathering point for the faculty and student interest in applying the ideas of openness to teaching and learning. I think that this group will make sure that the end-users of our software have a place at the conference. I also think that this group can nurture interest in areas like Open Education Resources (OER) and if there is an interest in developing practice or software for OER – Apereo might be a great place to incubate that work.
  • The WAD Portfolio Effort – Thanks to the efforts of Janice Smith, Shoji Kajita, Jacques Raynauld, and many others, there is continued interest in open source portfolio solutions. The current effort is a pre-incubation group working together on a product they call WAD (I don’t know what it stands for). The idea for WAD is to build a portfolio system outside of the LMS and find ways to do a deep integration to pull out LMS data as needed. In many ways WAD feels like a throwback to the OSP 1.0 times, where practicing pedagogists kept themselves very close to the emerging development efforts and gently steered the process. I am very excited to feel the energy in this group that being part of Apereo makes possible. It was exciting to see the re-engagement of some of the people who brought so much passion to OSP in the early days.
  • The Learning Analytics Effort – There has been a small group of highly interested folks within the Sakai community working on learning analytics activities for quite some time now. This has resulted in tools like SiteStats in Sakai. But as we gain understanding of the real approach to LA, it becomes increasingly clear that analytics work must be done outside of the LMS with (again) many deep integration points. Add to this the Tin Can API support in Sakai (and soon uPortal and OAE) and the way is paved to take real steps in a separate software development project that is just about analyzing learning data. This group is also pre-incubation, but it looks like interest is building in shared open source software to analyze learning data from many sources.
  • Sakai CLE – I will talk more about this later in a separate section. June 2012 was really the time when the CLE started to re-emerge from being under the radar in Sakai Foundation politics since about 2008. The 2.9 release (November 2012) and 2.9.2 release (May 2013) have greatly energized the community. Leading schools and commercial affiliates have enthusiastically jumped onto the bandwagon and many have converted or are converting to the 2.9 release. The 2.9 release has enough “good stuff” to make it attractive to move to the latest release. We as a community are reducing our installed version skew and that is very important for long-term sustainability. If we put out a version and no one installs it – it is game over.

In addition to these efforts, there were many other ideas bouncing around the hallways, breaks, and pubs. What was nice was to say over and over – “Hey, that could be a new Apereo working group!” What was most exciting for me was that these working groups would have had a tough time being part of Sakai, with a foundation that was dedicated to one (or two) core products and far too much debate about what should get the “resources”. In Apereo, with independent projects large and small and a laissez-faire approach from the foundation, each project builds its own small sub-community and finds its own resources. It is amazing how this Sakai+JASig community has so many ideas about what to do next – but when we were the “Sakai Foundation” the question of “Is that Sakai or not?” kept most of these nascent efforts from gaining forward momentum. Within Apereo, there is little to slow a small and dedicated group from moving an idea forward.

The Sakai CLE

So while I am excited about all the wonderful diversity of thinking that now makes up Apereo, the Sakai CLE is my personal focus. As I said above, 2012 was a breakout year for Sakai with the 2.9 release and quick uptake within the community. I felt that the Sakai Technical Coordination Committee (TCC) activity and commitment level at last year’s conference was amazing – the talent and commitment brought forth at that meeting powered us to complete the 2.9 release and brought the Sakai CLE up to par with the rest of the LMS marketplace.

I gave a talk titled Sakai: The First Ten Years and the Next(*) Ten Years that both celebrated how far we have come and talked about the upside for the future of Sakai once we have the basics nicely in place in Sakai 2.9.

On Sunday June 2, the TCC met with a focus on governance issues in the new Apereo world. We made some progress on gnarly issues but like the merger itself, evolving the Sakai CLE governance will be a work in progress for a while as we find the right shape for things in the new world. The good thing is that we are not slowing down to talk governance – the technical bits of Sakai continue to evolve and the governance bits will be addressed over time.

In the area of socialization, after the Sunday meeting we had the second-annual Festival of the Dead Cow (or Festival of the Spinach au Gratin, Sweet Potato Casserole, and Sautéed Mushrooms for the vegans of the group). We also had our traditional Karaoke Night out at Karaoke 101 in northern San Diego.

On Thursday June 6, we had the more technically-focused meeting of the TCC as we laid out the areas where folks intend to work over the next 12 months and tried to come up with a release schedule that we could collectively meet.

What was exciting was that it has started looking like the new capabilities of the release scheduled for roughly June of 2014 might have enough new functionality to justify a new major release (like any good committee we are endlessly debating what number that might be). Things like the new search from AsahiNet, the TinCan API project from Unicon, the dashboard tool from Michigan, Project Ketai to enhance web services and build a mobile app, LTI 2.0, a greatly enhanced Lessons capability, and many more start to sound like a very exciting release where we are very much moving forward and starting to address real emerging market needs rather than focusing on building basic functionality.

Looking Forward

While all this is exciting as our existing community will function so much more smoothly within the Apereo Foundation, I think that the most exciting things for Apereo are the ones we can neither imagine nor predict. With the “big tent” approach baked into the bylaws of Apereo, an existing thriving open focused community with participants and conferences around the world, and a solid incubation process emerging, I am hopeful that we will start to attract small new groups from areas outside our traditional LMS and portal spaces.

I would love to see new projects like an open source EPUB3 book authoring environment or building MOOC platforms or LTI tools to share across many LMS systems become part of Apereo.

The sky is now the limit for Apereo and it will be fun to see where it goes.

Trying to Get the History Right – The Sakai Board of Directors and "Sakai 3"

Updated: More dialog has happened and I have added it at the bottom of this post.

Michael Feldstein has written an excellent post titled The Death and Rebirth of Sakai OAE. Michael correctly celebrates the outstanding progress of the Apereo OAE Project. The OAE presentations at the recent Apereo Conference were outstanding and the stakeholders and development team seem very well aligned. These are all developments that make me more optimistic about OAE than I have ever been before.

But since I am an amateur historian and keenly interested in what it takes for Open Source to work, I felt that Michael's post conveniently missed one very important point in the historical account. So I wanted to make sure that the record reflects the significant role of the Sakai Foundation Board of Directors in the "unfortunate Sakai 3 situation".

By the way – I am not speaking for anyone except myself in this post.

My Comment to eLiterate

A great post. I too feel that the Apereo OAE project is well positioned for the future with a solid technical underpinning and the right set of stakeholders going forward.

I think that you mistakenly gloss over the culpability of the Sakai Foundation Board of Directors in the "failing" period of "Sakai 3"/OAE from 2008-2011. The problems of the "Sakai 3" effort, in their simplest form, were overconfidence, making too many promises, and not delivering on those promises. The seeds of this were sown at the 2009 Sakai Foundation Board retreat, where it was decided that making a "Sakai 3" was so important that the board authorized spending at levels higher than incoming revenue to hire a product manager and a marketing person, and to continue to fund a UX person as top priorities. This deficit spending continued with board blessing until late in 2010, at which point the foundation was bordering on bankruptcy. Finally, faced with bankruptcy, the board backed away from the policy that "OAE was too big to fail" and worth putting the foundation itself at risk to save. The board members and foundation coupled this ill-chosen financial strategy with effusive public "sales pitches" about OAE everywhere they went. It is not surprising that the community believed the foundation board and leadership – so it is really unfair to blame the badly misled and misinformed community. The "marketing" worked – sadly it was not followed up by a timely product. It was at the Sakai conference in Denver in 2010 that the Sakai board grudgingly accepted that the Sakai CLE product did not need to be shut down and allowed the formation of the Sakai Technical Coordination Committee (TCC) to govern the CLE relatively free of board meddling. Even throughout 2011 and into 2012, some board members yearned for a time when the board set the community agenda for both CLE and OAE by fiat – but thankfully those notions are fading into memory as we embrace the new "big tent" and community-centered philosophy that underpins the Apereo Foundation.

Reference: http://www.dr-chuck.com/csev-blog/2010/11/sakai-board-elections-2010-edition/

The lesson to learn is that the top-down management and marketing-driven approaches that “sort-of” work in the private sector – utterly fail in open-source communities where the participants (people and organizations) are volunteers.

I am really excited about the future potential of Apereo OAE to be a next-generation academic product for all the reasons that you cite in your post. I have great confidence in the team and the remaining stakeholders seem in it for the whole journey wherever it leads. We all need to applaud their efforts so far and look forward to more good things from OAE in the future – without adding too much in the form of expectations from the outside.

Ian Dolphin’s Comments

Ian Dolphin posted a great comment about the time period from March 2010-December 2010:

The financial situation in 2010 was as dire as Chuck represents. I'm not aware in as much detail as Chuck of the reasons for what amounted to a systemic overspend, or when that originated. I suspect the point of origin for the overspend to lie before 2009, to be frank. In demonising the Board, however, Chuck introduces a perspective which is misleading. The reduction in Foundation spending was begun by Lois Brooks, as interim Executive Director in February/March 2010. …

I agree with everything Ian said in his comment. Ian also made a blog post in December 2010 that I also agree with.

I made a follow-on comment to clarify my statements and agree with Ian.

My Second Comment to Michael’s Blog

Ian – Your narrative of when the board became aware of the dire financial situation accurately places board awareness of the problem in the February / March timeframe and accurately credits Lois Brooks with doing a great job as interim in addressing the financial situation during the March-June timeframe. She did a great job as interim, and when you came on in June you continued to do a great job in addressing the financial challenges. We would not be here were it not for the excellent leadership both of you provided during difficult times.

But the anti-CLE and pro-deficit-spending policy was strongly held at the board level up to the very last minute. I distinctly recall one board member making an impassioned plea that "deficit spending was essential to the success of OAE" in a late January / early February meeting as financial concerns were discussed. A few weeks later we saw balance sheets that showed how truly grave the situation was, and the talk of "strategic deficit spending" instantly gave way to talk of how to avoid bankruptcy. The CLE-as-deprecated board policy persisted until the June 2010 board meeting in Denver, *after* the OAE project suffered a major stakeholder pull-out (May 2010) and about four months after the board became aware that it was effectively bankrupt. And even at the June 2010 meeting, changing the policy to allow both the OAE and CLE to continue as equals under their own independent leadership was accepted grudgingly by some of the board members.

I have the ultimate respect for the OAE team and people involved in the project. The OAE team threw themselves at an impossible task back in 2008 – they raised funds (the board and foundation staff helped raise funds in a good way) and tried a bold form of governance for the project – they were forever on the edge of emerging technologies and gave us a beautiful view of what the future would look like. I will defend and support the OAE team past, present, and future. The OAE/CLE schism did not come from either the OAE or CLE – it arose as a result of a board of directors that felt that they had the authority to manage volunteer resources as if those resources belonged to them.

It all worked out (whew!) and I (like everyone else) want to move on. Because I think that the future is very bright for both OAE and CLE.

Receiving Grades IMS LTI Outcomes: Signed sourcedids

This is a design for an approach to securely accept grades from an external tool and put those grades in the grade book. Please review this design for security and feasibility.

Overview of the Structure of IMS Outcomes

When you look at the IMS Learning Tools Interoperability 1.1 (LTI) Launch, there is a field called the lis_result_sourcedid. If the administrator and instructor decide that a particular resource_link_id is to send grades to the grade book, they must create the column in the grade book and indicate which grade book column to accumulate grades for the particular resource.

The lis_result_sourcedid must contain enough information to uniquely identify the resource_link_id, course, and particular row and column to store a grade for the user. The value will be different for each tool launch from a different user. When the external tool wants to set a grade for a user, it must present the lis_result_sourcedid for that user_id/resource_link_id combination. This value is completely opaque to the tool – the tool is not supposed to be able to parse or otherwise understand this string – the tool must simply receive the value, store it, and then present it when attempting to set a grade.

This gives the Tool Consumer a wide range of choices as to how it constructs the lis_result_sourcedid. This document describes the particular approach that I use and recommend to create this lis_result_sourcedid.

Data Model for Resource Link Level Grade Secrets

To support this feature, we will add the following data item to the LTI Resource Link. The value need not be visible to the Instructor; there is little benefit in revealing it to the Instructor or admin.

imslti.gradesecret=random UUID

The LTI code generates the secret internally once the instructor or admin has indicated that this resource_link_id is supposed to accept grades.

During Launch: Constructing the lis_result_sourcedid

The essential data needed in the lis_result_sourcedid is the resource_link_id and the user_id. The key is to make sure that the lis_result_sourcedid cannot be tampered with while the tool is in possession of it. The base string will be as follows:

gradesecret + ':::' + resource_link_id + ':::' + user_id

A SHA1 signature will be computed from that base string and the lis_result_sourcedid sent in the launch will be:

signature + ':::' + resource_link_id + ':::' + user_id

All the other information regarding the grade book comes from data stored in the content item associated with the resource_link_id, so there is no need to replicate this information in the lis_result_sourcedid. There is little reason to otherwise encrypt the lis_result_sourcedid as it simply contains information the external tool already knows.

So the only protection needed is to ensure the integrity of the resource_link_id/user_id using a simple message signature. Encrypting the lis_result_sourcedid further would obfuscate the information for no particular increase in security, but it can be done if the TC wants to do so.

The resulting lis_result_sourcedid will be sent with LTI launches for which the instructor has configured the tool to receive grades.
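The construction above can be sketched in a few lines of Python. This is an illustrative helper, not the actual Sakai Java implementation; the ":::" separator and field order follow the description above.

```python
import hashlib
import uuid

SEP = ":::"

def new_grade_secret():
    """Generate imslti.gradesecret once grading is enabled for a resource link."""
    return str(uuid.uuid4())

def build_sourcedid(gradesecret, resource_link_id, user_id):
    """Construct the opaque lis_result_sourcedid sent in the launch.

    The SHA1 signature is computed over the secret plus the two ids,
    but the secret itself is never included in the sourcedid.
    """
    base = SEP.join([gradesecret, resource_link_id, user_id])
    signature = hashlib.sha1(base.encode("utf-8")).hexdigest()
    return SEP.join([signature, resource_link_id, user_id])
```

Because the tool never sees the gradesecret, only the signature, it cannot forge a valid lis_result_sourcedid for a different user or resource link.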

Use Case Walk Through

This section walks through the entire steps of the process in order.

Instructor or admin places the LTI resource link and configures it with the url, secret, and key.
Instructor or admin uses the LTI config UI to indicate that the tool will be sending grades, and picks the grade book column to store results if necessary. This causes the imslti.gradesecret property to be set with a random UUID if it is not already set.

Student launches the tool in the consumer. The launch includes the lis_result_sourcedid, which contains the resource_link_id and user_id plus an integrity signature based on the gradesecret.

The Tool Provider stores the lis_result_sourcedid for each user_id in its tables somewhere, remembering the oauth_consumer_key as well.

Student uses the tool and earns a grade, or perhaps the student uses the tool and the instructor goes into the tool and grades the student work and requests that grades be sent back to the TC.

Either as a side effect of the student completing the work, or of the instructor pressing a "send-grades" button, the tool provider creates an Outcomes service request message including the lis_result_sourcedid and signs the overall message using OAuth with the oauth_consumer_key which the Tool Consumer used to sign the launch request.

The overall message signature establishes that the TP indeed sent the message and that the message contents were not modified while in-transit.
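In LTI 1.1 the Outcomes request body is covered by the OAuth 1.0 signature through the oauth_body_hash extension: the POST body is hashed with SHA-1, base64-encoded, and included as an OAuth parameter in the signature base string. A minimal sketch of just the body-hash computation (the rest of OAuth signing is omitted):

```python
import base64
import hashlib

def oauth_body_hash(body):
    """Compute the oauth_body_hash parameter for a raw POST body (bytes):
    the base64 encoding of the SHA-1 digest of the body."""
    return base64.b64encode(hashlib.sha1(body).digest()).decode("ascii")
```

Since the body hash participates in the OAuth signature, any in-transit change to the XML body invalidates the signature, which is what gives the "message contents were not modified" guarantee above.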

The service message is sent to the designated service URL on the tool consumer.

(1) The TC parses the lis_result_sourcedid, producing a signature, resource_link_id and user_id.
(2) The TC looks up the content item using the resource_link_id
(3) It then pulls gradesecret from the content item and checks the lis_result_sourcedid signature
(4) It then looks up the oauth_consumer_key and secret from the content item (or from a system wide registry) and checks the OAuth signature of the message.
(5) The TC verifies that the user_id is a member of the context_id that contains the content item (resource_link_id).

If all of the above tests pass – we set the grade.
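Steps (1) through (3) of the verification above can be sketched as follows, again assuming the ":::"-separated format; the constant-time comparison guards against timing side channels.

```python
import hashlib
import hmac

SEP = ":::"

def parse_and_verify(sourcedid, gradesecret):
    """Split the sourcedid, recompute the signature from the stored
    gradesecret, and return (resource_link_id, user_id) on success,
    or None if the format or signature is bad."""
    parts = sourcedid.split(SEP)
    if len(parts) != 3:
        return None
    signature, resource_link_id, user_id = parts
    base = SEP.join([gradesecret, resource_link_id, user_id])
    expected = hashlib.sha1(base.encode("utf-8")).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return None
    return (resource_link_id, user_id)
```

Steps (4) and (5) – checking the OAuth signature of the whole message and confirming the user's membership in the context – are consumer-specific lookups and are not shown.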

Advanced Concepts (Optional)

Since these secrets are system-generated and not user-entered, a TC can adopt a "rotation" policy to change the grade secret periodically for each resource_link_id. To do this we add two new data items to each resource link associated with a resource_link_id:

imslti.gradesecretdate=Date the gradesecret was set
imslti.oldgradesecret=The immediately prior secret

Based on some time scale chosen by the TC administrator, the TC can go through and “expire secrets” that are older than some desired time period. The regeneration process can copy the current value for gradesecret to oldgradesecret and then generate a new gradesecret and update the gradesecretdate. The verification process above is updated to check both the gradesecret and oldgradesecret and accept either as valid.

The launch process simply uses the latest gradesecret when it signs the lis_result_sourcedid during a launch thus extending the expiry date on the lis_result_sourcedid.

If the TC system automatically regenerates all gradesecret values after 15 days, it will appear to the TP that signed lis_result_sourcedid values expire after a minimum of 15 and maximum of 30 days depending on the relative timing of the generation of the gradesecret and the generation of the lis_result_sourcedid.

A gradesecret and/or oldgradesecret can be manually altered, removed or regenerated at any time to “invalidate” all outstanding lis_result_sourcedid values.
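The rotation and two-secret verification described above can be sketched as follows, assuming the resource link properties are held in a dict keyed by the property names given earlier (the helpers are hypothetical, not Sakai's actual code):

```python
import hashlib
import hmac
import time
import uuid

SEP = ":::"

def rotate_if_expired(link, max_age_days=15):
    """Copy the current gradesecret to oldgradesecret and mint a new
    one when the current secret is older than max_age_days."""
    if time.time() - link["imslti.gradesecretdate"] > max_age_days * 86400:
        link["imslti.oldgradesecret"] = link["imslti.gradesecret"]
        link["imslti.gradesecret"] = str(uuid.uuid4())
        link["imslti.gradesecretdate"] = time.time()

def signature_valid(sourcedid, link):
    """Accept a sourcedid signed with either the current or the prior
    secret, so values issued just before a rotation remain valid."""
    parts = sourcedid.split(SEP)
    if len(parts) != 3:
        return False
    signature, resource_link_id, user_id = parts
    for secret in (link.get("imslti.gradesecret"),
                   link.get("imslti.oldgradesecret")):
        if secret is None:
            continue
        base = SEP.join([secret, resource_link_id, user_id])
        expected = hashlib.sha1(base.encode("utf-8")).hexdigest()
        if hmac.compare_digest(signature, expected):
            return True
    return False
```

Running rotation on every secret older than 15 days yields exactly the 15-to-30-day effective expiry window described above: a sourcedid survives one rotation (as oldgradesecret) but not two.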

Sample Source Code

This is implemented in the Sakai ServiceServlet; the code is here:

https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-blis/src/java/org/sakaiproject/blti/ServiceServlet.java

Look for the doPostXml method and walk through it (patches welcome). There are libraries of code elsewhere in the source tree – but this is the starting point for Sakai’s outcome service code.

Conclusion

This document introduces the notion of a signed lis_result_sourcedid that allows very fine-grained authorization down to the individual user / resource_link_id if the TC so chooses. The approach includes provisions for grade secret revocation, and seamless grade secret expiry and rotation.

Comments welcome…

Original document written: Version: 3 – August 18, 2010