Coursera Never Ceases to Amaze Me – Community Teaching Assistants And a Few Other Things

I am now teaching my Internet History, Technology, and Security course on Coursera for the third time. You might think that by the third time I teach a course online it would become routine. But that is not the case.

I love being part of watching the Coursera software and culture grow and improve under our feet. The engineers, course operations, and support folks at Coursera are a pure joy to work with. They come up with new ideas and put them into production so quickly – it amazes me. When I have a new idea about how something should work and propose it in the partners forum, over half the time they have already implemented my idea and I simply had not discovered it yet. Other times they take my idea, add to it, and often in less than a month there is a new feature. Only rarely do I want something and not get it – usually the reason is security – which I fully understand – but I want to always get my way – since I am so spoiled.

Two weeks ago I spent a week at Coursera HQ working on the code for a feature that I hope will soon be released. I worked with Brennan, Nick and Michele (the new intern). I learned a little Scala and by the end of the week I “sort of” got this new-fangled functional programming movement. I hope to see that feature soon – I will likely go back to SFO when it is done to celebrate because it means a lot to me personally.

In terms of culture, I could not be more excited about the Community Teaching Assistant (CTA) program as led by Norian Caporale-Berkowitz. CTAs are selected from the outstanding students of previous sessions who have both mastered the material solidly and shown a natural inclination to teach their fellow students. They volunteer for the next session of the class, help create its culture, stay close to the next round of students, and help them through the materials in the course.

What is especially cool is that we have a special forum for the CTAs and teaching staff of the course where we discuss and solve problems, and they help make sure that important things are brought to my attention quickly. I am still in the class discussions and do most of the content creation for the class – but I also have a group that can review my new materials before I release them and catch problems. I spend about an equal amount of time in the course forums and the TA forum.

What is most exciting to me is how much it feels like face-to-face teaching like I do at the University of Michigan. We have a team of people trying to make the learning experience as good as it can be for everyone. I depend on my teaching assistants and they depend on me. I make stuff and they help me form the stuff of the course and then help the students understand the stuff.

This feels very scalable and fits my model of teach the teacher. I so badly want to see my class material reused and remixed by teachers around the world at other schools, community colleges, and high schools – but I don’t want to completely lose touch with the teachers in those classes and even the students in those classes. I am imagining a situation where a class like my Python class is literally being taught at hundreds of places at the same time to many thousands of students – and we have a set of forums that allow the whole community of teachers to be able to share ideas and help and support each other.

Of course software will need tweaking and we will need to invent new cultural and interpersonal models as we scale things up – but it feels really wonderful. I have this sneaking suspicion that we are slowly re-inventing the Open University model but with less focus on waterfall design of course materials, allowance for translation and remixing, and reduced focus on instructional design. What we will adopt is the lead faculty / mentor model – but with more permeable boundaries and more agile content.

It is a great time to be watching new roads being laid down at the frontier of all of this. Many will say that there is nothing new here. The statement that there is nothing new in Coursera/Udacity/edX/NovoEd style classes is trivially correct but completely wrong.

The technology we use is far better than anything that came before – but more importantly we are creating multi-layer, cross-cultural, cross-disciplinary, and fresh human structures that have *never* been part of online education. These new efforts like Coursera/Udacity/edX/NovoEd are breaking new ground in the increasingly important connection between humans, information, and technology in the quest to educate and learn. Nothing that came before is even close in simultaneously exploring all three dimensions of this new space. I only wish more people could experience the joy I feel when teaching in Coursera.

And who knows where this will lead, with edX now open source….

Being Blackboard’s Sakai Chief Strategist – One Year In

Note: I am not speaking for Blackboard in this post and I am not speaking for the University of Michigan in this post. The opinions in this are my own.

Just over a year ago, in addition to my full-time job as a Clinical Associate Professor at the University of Michigan School of Information, I became a part-time employee of Blackboard, Inc. with the title of Sakai Chief Strategist (My Blog Post / Press Release).

I figure it is probably a good time to give a bit of a status update as to how things went this past year and what I did.

A little more context on my decision / strategy

You can go back and read all my motivation, rationalization, and plans from my blog posts from late March and early April of last year. While all I said was completely true – there was one small detail in my motivation for going to Blackboard that I somewhat understated a year ago. You can see a running theme of Sakai CLE resources in those posts but not up-front and explicit. Now a year later I can be much more explicit.

Early in 2012, things were looking really bleak for the Sakai CLE. The progress toward the 2.9 release had slowed and eventually stopped. I was really worried that if Sakai 2.9 (most importantly Rutgers' LessonBuilder) did not ship, Sakai 2.8 would not be able to hold its market share because of its lack of structured and sequenced content. But I felt that 2.9, with its new portal and LessonBuilder, would be a solid and competent LMS that would have a long life in the US and around the world.

I did not want to quit so close to the finish line. If you recall, Sakai 2.9 was in *beta* when the TCC cancelled all further release activity.

In January – March 2012, I felt that we were seeing the end of Sakai before our eyes. In this blog post from March 31 last year – I get a little testy and call out the Sakai community for its lack of investment in the commons. You can see my frustration, anger, and fear with respect to Sakai’s long-term survival in that post.

Michael Chasen was willing to give me money and resources to invest in Sakai so we could finish and ship Sakai 2.9. He would put little or no constraints on how I spent the money – it was mine to spend as I saw fit. It was not enough money to take over Sakai development and release management but I gave him a figure that I felt would get things moving again when added to the rest of the Sakai community resources.

You can look at the Sakai 2.9.0 release cycle document to see that 2.9.0 was finished November 12, 2012. The code freeze and first release tag (A01) had been created 13 months earlier on October 17, 2011.

I want to make it really clear that many people deserve credit for the 2.9.0 release. My contribution and Blackboard’s contribution to 2.9.0 was non-trivial but many others contributed much more than Blackboard or me. It was a cross-community effort and I was only a part of that effort – which is as it should be.

What Got Done? How Did You Spend Your Time? How Did You Spend the Money?

Here is a list in no particular order.

  • I bought food, drink, stickers, and shirts for community leadership. Sorry – but this is important. For those of you who have been in Sakai from the beginning, you may remember situations where I used the University of Michigan credit card (backed by grant funds) to pick up the tab for 40-50 people at a time. If you are going to volunteer your spare time to do hundreds of hours of quality assurance or software development – then *someone* should at least buy you a meal or two to say “thanks”. I don’t hesitate to use my Blackboard American Express Card to pick up the tab when I am sitting at a table with a bunch of amazing community volunteers. I took the entire TCC and a few guests to Ruth's Chris at the June 2012 meeting. I turned in an expense of $3700 for the TCC dinner. It was approved with no questions asked. It was a bargain given the amount of work that the people eating those steaks have contributed over the years.
  • I contracted with developers in the Sakai community from June 2012 – February 2013 to work on resolving any and every outstanding Sakai 2.9 issue they could find. When I paid these contractors I did not ask that they make any public statements about the source of their funds. This was not about getting credit for Blackboard per se, it was simply to get the product out the door with whatever it took.
  • I paid for travel for several people in the Sakai community who could not otherwise attend meetings where I felt their presence was very valuable and their organizations could not afford their attendance. Again, the funds were given without any requirement of public announcement that the funds came from Blackboard. These funds were a gift/grant because I wanted the particular person to be at the meeting – it was not to make it about Blackboard.
  • I travelled to the Sakai meetings in Atlanta, Paris, and Puebla Mexico as well as had Blackboard pay my way to the Sakai Foundation board meetings while I was still on the board. I also went to a Moodle Moot in Los Angeles. I gave talks at each of these meetings – mostly focused on bringing some excitement back to the CLE and making sure that everyone knew how awesome Sakai 2.9 was.
  • Blackboard paid my part-time salary and made it possible for me to spend nearly all my own spare time working on Sakai. Beyond bug fixes and release support for Sakai 2.9, my primary large developments were to completely replace the Sakai Web Content tool with a JSR-168 portlet that eliminated the double iframe problem and allowed us to deal with sites that are starting to set the X-Frame-Options header, and to make a major investment in cleaning up the LTI code in Sakai, releasing a new version of LTI with Sakai 2.9.2 that fixed over 45 problems that were identified in the LTI code from the 2.9.0 release. Both the new Web Content tool and the LTI code should be in the upcoming 2.9.2 release. This was all supported by Blackboard. The University of Michigan (Beth, Zhen, and John) can also take a bunch of the credit for the new LTI code. Matt Jones and Sam Ottenhoff of LongSight also helped with the Web Content tool.
  • I even spent Blackboard funds to send myself to a purely academic conference. I figure I am an academic researcher – I should go to an academic (i.e. not industry) conference once in a while. It was my first non-industry conference in years.
  • I installed a Sakai 2.9 QA server on my Blackboard-provided server hardware. I wanted this so I could do more complete tests of Sakai’s increasing support for LTI-related web services.
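To illustrate the X-Frame-Options problem mentioned above: when a target site sends that response header, the browser may refuse to render the page inside an iframe, so a tool has to choose between framing the page and opening it in a new window. Here is a minimal sketch of that decision in Python; the function name and example origins are mine for illustration, not Sakai's actual code.

```python
def frame_decision(xfo_header, target_origin, our_origin):
    """Decide whether a page can be shown in an iframe based on its
    X-Frame-Options header (None means the header was absent)."""
    xfo = (xfo_header or "").strip().upper()
    if xfo == "DENY":
        return "new-window"          # framing forbidden everywhere
    if xfo == "SAMEORIGIN" and target_origin != our_origin:
        return "new-window"          # only its own origin may frame it
    if xfo.startswith("ALLOW-FROM") and our_origin.upper() not in xfo:
        return "new-window"          # we are not the allowed framer
    return "iframe"                  # no restriction applies to us

# A page that sends DENY must be opened in a new window
print(frame_decision("DENY", "https://site.example", "https://lms.example"))
```

The real tool has to make this call server-side before rendering the site page, which is part of why the old double-iframe approach had to go.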

So What Did You Do For Blackboard Last Year?

I did do a few things for Blackboard that weren’t just working on Sakai.

  • I advocated internally for quick implementation of LTI 1.1 in ANGEL – which was completed in mid-summer and announced at BbWorld 2012 in July. I joyfully filled out my IMS Ring of Compliance with the ANGEL logo and got the tattoo in New Orleans with Linda Feng doing the filming.
  • I did a review of the xpLor integration API and made suggestions that led to a new version of the API that makes it more reasonable to integrate xpLor into an LMS at “arms length” – it made xpLor more like just another external tool in terms of its access to information within the LMS. I hope this new API will be the basis for a general-purpose Learning Object Repository Integration (LORI) API in an upcoming version of IMS LTI 2.x and that such an API will allow many tools (i.e. not just xpLor) to enjoy deep integration with LMS systems.
  • I built the API code to support the revised LORI API so that Blackboard’s xpLor (as well as other LOR products) can smoothly integrate into Sakai’s Lessons capability as of Sakai 2.9.2. I also built PHP sample code/unit tests for that API.
  • I have been interacting with the Blackboard Engage (formerly Edline) group that sells a very successful K12 CMS system. I am working with them as they explore LTI integrations. (I am not announcing any delivery of any product here)
  • I have been attending IMS quarterly meetings and Learning Impact to continue to move IMS LTI 2.0 and other IMS activities forward.
  • I have been working with Blackboard Collaborate (formerly Wimba/Elluminate) as they explore the LTI integration. They are in pretty good shape on their own but I help a little bit here and there when they ask.
  • I worked with my new colleagues from Moodlerooms. But really I mostly just learned from them. We went through the design of the Moodle 2.3 LTI code and its design approaches – and I stole many of those ideas for LTI in Sakai. They showed me some of their secrets of how they get Moodle to scale so well. Their approaches really informed what I think we should do in Sakai if we want to become a SaaS application. Seeing the Moodlerooms SaaS implementation makes me just a little jealous.

There were several things I did not do in the past year. First, I am not allowed to work on closed-source products directly as part of the employment contract between the University of Michigan, Blackboard, and me. So I have not written any code for Learn, xpLor, or any of Blackboard’s non-open-source products. This has not been a problem – I have been plenty busy with Sakai and IMS things and Blackboard has plenty of really talented folks working on those products. The second thing I have not done is any kind of sales support. Three times in the past year, a salesperson has contacted me asking for help in a situation where Sakai was involved in a possible Learn sale. In each case I have politely declined. My management (in the Blackboard engineering organization) has put absolutely no pressure on me at all to do any sales support. I doubt they even knew I was contacted. I have answered a few technical questions about different versions of Sakai or how to convert data but nothing on the strategies or tactics of a particular sale.

So I feel very comfortable and feel no conflict of interest. If Blackboard products fare well in the marketplace I am happy because my budget for money to spend on Sakai goes up. But I have had no problem at all remaining 100% loyal to Sakai and the Sakai community over the past year as a Blackboard employee – in particular because my management expects me to be committed to Sakai and the Sakai community.

So What Went Wrong?

Nothing is perfect, right? There must be some disappointment.

If I look back over the year, the only thing that leaves me a little disappointed is that I really wanted to get to the point that I could use Blackboard’s performance testing lab and quality assurance processes and apply them to Sakai as a way to increase overall resources available to the community. This would be an amazing contribution if I can pull it off. I made some progress on this last summer when the performance test lab in Blackboard (which is awesome) did some performance testing on Sakai and identified some areas that could be improved.

I really would love to have gotten that work finished and presented to the Sakai community – but I just got too distracted by other things to stay on top of that task and bring it home. And once Sakai 2.9.0 shipped it seemed to me to be less pressing. Perhaps in time I will come back to this task and finish it. But even now, it is kind of on my back burner.

Not much else went wrong.

So What Is Next?

In general, much of the roadmap of the Sakai CLE is discussed and set at the annual meeting – this year in San Diego. So some of these priorities might get adjusted after that meeting. But for now, these are the two tasks I will set out to accomplish this summer as my Blackboard contribution to Sakai:

  • I want Sakai to be the first LMS to ship LTI 2.0 support. While LTI 1.1 is great and nearly universal, it is starting to fray at the edges as each LMS pours more and more extensions into it. These extensions take widely different approaches, use different formats and web service interaction patterns. There is no interoperability and no conformance tests for these extensions. LTI 2.0 gives a way to solve all these problems – but we have to get started before there is any payoff. So I plan to write the Sakai LTI 2.0 support, a full set of PHP sample code to complement the Rails sample code developed by John Tibbets and contributed to IMS, and work to get LTI 2.0 into Moodle through my Moodlerooms colleagues. I will also start building sample LTI 2.0 tools and write LTI 2.0 documentation to help evangelize LTI 2.0 to other LMS systems. This is a long task – but the best time to get started is now.
  • Once I have LTI 2.0 underway I want to circle back and look at IMS Common Cartridge import and export in Sakai. Chuck Hedrick has done a great job with Lessons in terms of CC import and export – but I want to expand it to interact with everything in Sakai – not just content in Lessons. I want to look at interoperability of the cartridges in a way that supports open educational resource use cases.
  • I will be teaching a Python MOOC on Blackboard’s CourseSites platform. I want to use this as a way to learn how to teach using Bb Learn and explore some of the cool features of Learn as well as spread my Python material to a few thousand more students through yet another channel. I also expect to serve as an early heavy user of LTI in CourseSites to make sure that it is easy for others who come in after me. I want to also play with the nice Common Cartridge and Open Educational Resource support in Learn as well – again to serve as a pattern for others to follow building MOOC / OER courses. Here is the link to enroll in my course (scroll down to Python for Informatics).

I would like to increase Blackboard’s direct support of the Apereo Foundation. We spent three+ years merging Sakai and JASig – for me it is time to invest in Apereo so it can move into the kinds of wonderful new areas we had imagined as we designed the merger.

Summary/Reflection

It has been a heck of a year. Releasing Sakai 2.9.0 and (soon) Sakai 2.9.2 will be really important milestones for me. My own measurement of the value of my Blackboard activities is simply that the Sakai community is thriving and healthy and the product continues to move forward and improve.

Back a year ago I told people that this would all be “no big deal” and everything would be fine. I hope that people now see, a year later, that this is indeed the case. Blackboard has gently supported the Sakai community in appropriate ways and without fanfare. As was stated in March of last year, Blackboard intends to have a healthy engagement in open source activities like Moodle and Sakai and to do so in a way that advances the causes of those communities in order to have a healthy open source ecosystem in higher education.

When I look at both my involvement in Sakai and the Moodlerooms team’s involvement in Moodle – I am pretty pleased and proud of what has been done so far.

As always, comments welcome.

One More Day of Thought – Introducing CC-One (Formerly CC-Infinity)

I am totally geeked and somewhat histrionic that CC took a look at my plight. Elliot Harmon of CC commented on my post about CC-∞ (Infinity) from yesterday.

… Thinking about CC Infinity, I worry that it would create an infinite number (sorry) of incompatible bodies of work. The exciting promise of OER is the ability to seamlessly mix content together from different sources. Navigating a complicated set of restrictions would make life much more difficult for educators and content creators….

Since my response is really long, I decided to make it a whole post.


Elliot,

Thanks for your comment. I feel good that the topic is receiving some discussion at CCHQ. When I say “content slums” – I mean any cloning of material for the sole purpose of making money off ads, getting into search results or taking away views. YouTube is not a content slum – but a YouTube channel with nothing but cloned content is a slum (in my vernacular). But my definition hardly matters.

The problem in a sense is the kind of thing that happens when something like CC-BY is very successful – I used it on everything. But at some point my fragmentary bits come together in something like a whole book or whole course and after years of development and promotion my work starts to get some attention. But that very moment that I get attention for my work is the exact moment when bottom-feeders can gain the most advantage by cloning those materials.

There comes a time when one needs something more precise than the CC-BY series. One might say “use NC” and that will keep people from cloning content on YouTube with ads. But if they are caught – they turn off ads for a few days and then, when no one is looking, they turn them back on. If all they are doing is cloning materials, they are complying with ND. And they are not trying to limit others from spamming – so they are complying with SA. So all the CC additions are pretty much useless in the face of those whose intention is to clone (and not remix or add value to) materials.

The answer is ARR with pre-granted permissions. In order to avoid the “incompatibility” you speak of above, I would word all the permissions in the following form:

If you are printing a limited number of copies of this book for use in a course,
then you are granted CC-BY license to these materials for that purpose.

If you translate this book into a language other than English,
then you are granted a CC-BY license to these materials with respect
to the publication of your translation. In particular you
are permitted to sell the resulting translated book commercially.

If you are hosting these materials on a server not connected directly to
the Internet (i.e. behind a firewall) to better serve a local population,
then you are granted a CC-BY license to these materials for that purpose.

If you are creating a derived work that includes more than 50% and less than 90% of
this content then you are granted a CC-BY-SA-NC license to these
materials for that purpose.

If you are creating a derived work that includes more than 5% and less than 50% of
this content then you are granted a CC-BY license to these
materials for that purpose.

If you are creating a derived work that includes less than 5% of
this content then you are granted a CC0 license to these
materials for that purpose.
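The proportional clauses above amount to a simple threshold rule. Here is how I think of the mapping, sketched in Python; the exact boundary behavior at 5%, 50%, and 90% is my assumption, and the real license would need a lawyer's wording.

```python
def license_for_fraction(fraction):
    """Map the share of this content reused in a derived work (0.0-1.0)
    to the pre-granted license; above 90% the ARR default stands and
    you must ask permission."""
    if fraction < 0.05:
        return "CC0"
    if fraction < 0.50:
        return "CC-BY"
    if fraction < 0.90:
        return "CC-BY-SA-NC"
    return "All Rights Reserved"

print(license_for_fraction(0.30))  # → CC-BY
```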

By limiting the statements to when a non-ARR license can be used, but insisting that those licenses be drawn from the existing CC set, I hope that the only complex legal interpretation will be the “when to apply” parts of the statements and not the “what happens” parts. Of course I am not a lawyer… :)

You can sit in a room at CCHQ and convince yourselves that such a thing would somehow confuse the CC brand. It indeed might. And so CC might decide not to do it. But just because CC does not build such a thing, it does not mean that the thing is not needed and it does not keep someone else (i.e. like me) from building such a thing.

I have decided that CC-Infinity is a bad moniker for the idea. Yesterday I was in a hurry and trying to figure what the “opposite” of CC0 was. My new “opposite of CC0” is CC1 – CC-One.

With the addition of CC-One, there is a delightful slider bar of options on a number line. CC0 would be at 0.0, CC-BY would be at 0.25, CC-BY-SA would be at 0.5, CC-BY-SA-NC would be at 0.75, CC-BY-SA-ND-NC would be at 0.85, and CC1 would be at 1.0. It is beautiful – CC1 completes the set perfectly. CC0 starts from PD and works up while CC1 starts from ARR and works down, and the CC-BY series populates useful stopping points in the middle.
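The number line can be written down directly. The positions below are the illustrative values from this post (not any official CC scale), and the small helper just shows that the ordering makes licenses comparable.

```python
# Illustrative positions on a 0.0-1.0 restrictiveness line, per the post
LICENSE_SCALE = {
    "CC0": 0.0,
    "CC-BY": 0.25,
    "CC-BY-SA": 0.5,
    "CC-BY-SA-NC": 0.75,
    "CC-BY-SA-ND-NC": 0.85,
    "CC1": 1.0,
}

def more_restrictive(a, b):
    """Return whichever of two licenses sits higher on the line."""
    return a if LICENSE_SCALE[a] >= LICENSE_SCALE[b] else b

print(more_restrictive("CC-BY", "CC1"))  # → CC1
```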

I love the symmetry – as an engineer it feels like it is now complete.

I also understand if CC thinks that if it makes CC1 it will become too popular and folks will abandon CC-BY, preferring CC1 even for little things like a Flickr photo. This might reduce the overall amount of CC-BY.

I would disagree. I think that CC1 would mean more people would find one of the CC licenses suitable for far more materials. Some might start with CC1 to dip their toes into CC and then, after becoming more educated and comfortable, know when they want to use the CC-BY series. I think that CC1 might lead to a short-term drop in the use of CC-BY and friends – but in the long run – by making the CC language more expressive and widening the range, we can involve far more content creators in CC overall. Every crack we can put in ARR is a step in the right direction. And frankly, something like CC1 might give mainstream publishers and content producers a way to loosen their grip ever so slightly in a way that more slowly achieves our common goals – but does so for a far wider range of materials.

Comments welcome.

P.S. You really need to look at the Bill Fitzgerald post on Creative Commons and Human Nature where he talks about the recent improvement in the Createspace policy that addresses the rights of the copyright holder when the copyright license is “non-exclusive”. It is a beautiful thing – the policy was changed from “first in wins” to “copyright holder wins” early last year. If CC had something to do with this – outstanding. If not, you should say nice things about it and try to get other distribution channels to adopt similar policies. I had a very unhappy run-in with Createspace back in 2010 that caused me a lot of pointless work – back when the policy was “first in wins” and the material was CC-BY-SA.

P.P.S. If I do build this “something less than ARR” framework, I won’t call it CC1 because then I would be sued (rightfully) for trademark infringement. I would love to call it “ARRGH” but I don’t yet know what the “G” and “H” stand for to make the acronym work. :)

The Day After CC-BY Fail – CC-Infinity

Yesterday was an interesting and emotional day for me.

  • I made a video of how upset I am at my own mistake of putting CC-BY on materials and having that decision play into the hands of spammers who would use my content for search or link bait.
  • I removed all the CC-BY references from my Coursera Internet History, Technology, and Security recorded lectures and replaced it with All Rights Reserved.
  • I got some help from Cory Doctorow (tweet) who noticed my situation and gave me some reassurance that YouTube would fall on the side of the copyright holder and not let the spam stand even with the small detail of the CC-BY license. I of course did not believe him (tweet).
  • YouTube did remove the offending videos – even the ones that were CC-BY. Cory was right. It only took two days. (tweet)
  • I made a new version of my histrionic video addressing the issues independent of YouTube taking the videos down.
  • Bill Fitzgerald wrote an excellent post titled Creative Commons and Human Nature where he covers some of these issues with less histrionics than me. In particular, he points out that Createspace has an excellent policy that nicely addresses the rights of the copyright holder when the copyright license is “non-exclusive”. It is a beautiful thing – the policy was changed from “first in wins” to “copyright holder wins” early last year. Worth a read.

So reading the above sequence of events you might think, “Great – You won against a spammer.” – but actually that is not at all how I feel.

I effectively used a bit of bluster and YouTube’s general tendency to do whatever the copyright holder asks to get my way. But in effect, I succeeded in revoking CC-BY after the fact and that makes me feel bad. I don’t want to break CC-BY – it was my mistake to use it and legalize spammers.

So I am still removing CC-BY from all my materials that I don’t want used as spam or link bait. I don’t want to face YouTube the next time and have them look at the actual copyright detail and decide that I have no recourse.

We still need a license that protects high-value OER materials from inappropriate reuse while enabling responsible reuse without any requirement of permission.

What we need – CC-∞ (Infinity)

I tentatively title my idea CC-∞ as an homage to Creative Commons' CC0. CC-∞ starts with All Rights Reserved as the default license (much like CC0 starts with, effectively, Public Domain) and then adds statements that define legitimate reuse scenarios for which permission is explicitly given.

In a sense if we look at CC-BY+SA+ND+NC it is structured as a liberal license that adds increasing restrictions based on the desires of the copyright holder. CC-∞ is the opposite – it is a restrictive license that adds clauses that make it more liberal in scenarios per the wishes of the copyright holder.

Much like in all of CC – we are best served if the smart lawyers at Creative Commons draft these up. All this stuff is so complex especially when international laws are involved. This is not something that I should draft up by myself – but unless CC builds something like this – I will be forced to define CC-∞ myself – and it will suck. But it will be better than any of the CC-BY variations and better than All Rights Reserved – but still suck unless Creative Commons steps up and does this work.

Reflection

I was really upset yesterday. But I should be clear that I was not upset at YouTube and I was not upset at the spammer. I was upset at myself for not anticipating this “CC-BY” gotcha. It is always painful when you assume that you are safe and doing the right thing and then something jolts you and makes it clear to you that you made a mistake. And you might suffer negative consequences for your mistake. Yesterday my intellectual property and copyright cheese got moved and I was scared and upset.

Now 48 hours later all is well. I will still keep CC-BY on much of my materials – for example, my Python for Informatics course on online.dr-chuck.com is 100% CC-BY and releases all its materials with CC-BY (here). I am not going to remove the CC-BY from these materials because I really want them distributed as widely as possible and want to pre-permit unfettered reuse – even if a copy of the materials ends up in a content slum. I have thought that through and am prepared for it.

Going forward, I won’t just put CC-BY on everything I create related to teaching and learning. I will put it on most of what I produce – just not all. I will ask myself the question, “Are you prepared to make spam-like reuse of these materials legitimate?” If the answer is “yes” – then I will use CC-BY, and if the answer is “no” I will use my own self-constructed version of CC-∞.

Just in passing, I would just like to note that I am very explicitly not revealing *who* did the spamming. They probably did not have any truly evil plan – they do not deserve any particular attention or criticism. They were in possession of a few MP4 files and put them up on YouTube. I got them taken down – nothing to see here – move along.

Thanks for listening and as always comments welcome.

Creative Commons Has Failed Me and My Heart is Breaking

Update: YouTube did take down the spammer copies of my videos – I am glad this worked out but I will still move away from CC-BY for some of my material.

Update: Bill Fitzgerald wrote an excellent post about this and points out that CreateSpace actually has a policy about this. That is great news – it was not the case several years back. I also get accused of histrionics by one of the commenters on his blog. I probably am guilty of histrionics.

I understand that this is the fault of Copyright Law and not Creative Commons per se, but I am at the point where I will be using CC licenses on my materials less and less.

I am 100% committed to allowing reasonable use, reuse, remixing, translation, and republishing of my materials – both commercial and non-commercial.

For years I have been using the CC-BY license, wanting to give maximum flexibility to those who come into possession of my materials. I don’t want to add the SA, NC, or ND clauses to my licenses because they limit the freedom of those using or adapting my materials.

It turns out that the only thing I don’t want people to do is simply clone my materials with no value added at all and put the cloned copies up on competing sites as link and search bait. It is like my material is trapped in a content slum. You might think that search engines can tell the difference between me publishing my content and some scumbag replicating it in a content slum – but they can’t. When enough slums exist, the original is lost in the noise.

These unethical spammers are not making derivative works (they merely clone my materials) and they are not trying to limit redistribution – so they are *technically* perfectly legal w.r.t. CC-BY.

If they did something like translate my work into multiple languages or even auto-tune my lectures, it would be awesome.

So for now, I am going to start converting my materials away from any CC license unless I am willing to have 1000 useless spammers duplicate the materials I am creating. Some materials I will still release as CC-BY – but my richest and most well-developed materials will be All Rights Reserved with some kind of asterisk.

Perhaps I will write up my own copyright license that tries to give flexibility and options to those who would use my materials in a responsible manner while prohibiting evil spammers from using my work as link bait.

I doubt that there is any legal way to capture what I really want. Sadly, “All Rights Reserved”, while reprehensible, at least gives me recourse when spammers decide my stuff is worth ripping off.

Sad sad sad. My heart is breaking. It almost brings me to tears to think about it.

It would be great if I were wrong – but I don’t think I am.

Note: I would add in passing that software can use a trademark to protect its brand while allowing flexible copyright licensing. But since books, videos, and other similar materials cannot take advantage of trademark protection for a brand, I have to fall back to All Rights Reserved.

Note: My son Brent (the musician) is sitting here doing his Algebra homework and watching as I write this. He looked closely at CC for his work a few years back and felt that it made no sense at all. He would almost certainly let anyone who asked use his work – all they have to do is ask. He sits here wondering why it took me so long to figure it out and why I am feeling so bad about switching to All Rights Reserved and just saying ‘yes’ to requests for reasonable reuse.

Update: Commenters pointed out that spammers will ignore copyright – that is of course true. But if those spammers use a well-known site like YouTube or Amazon CreateSpace as the outlet for their competitive clones of CC-BY materials, those sites will correctly say “too bad” when you ask them to take the copies down. Google’s web search is better at catching and punishing “content slums” – but the internal search engines of sites like YouTube and Amazon, which only look at their own content, can’t tell the difference between the original CC-BY content and a useless duplicate.

Reusing Parts of Sakai’s LTI Java Implementation

I have been recently talking to several folks about LTI implementations in Java environments and whether there is reusable code in the Sakai course tree.

The answer is ‘yes’ – and I have structured the Sakai source tree to make this as easy as possible by isolating the generic code from the Sakai code. The “generic” code in the Sakai source tree is based on the IMS code and is copyright IMS (and others) under the Apache License. The Sakai-specific code is copyright the Sakai (Apereo) Foundation under the ECL 2.0 license. The Sakai-specific code is useful as an example but not particularly suited to direct reuse.

The reusable (non-Sakai) code is here:

https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-util/

The Sakai code that makes use of these utilities is scattered throughout, but here are the high points:

Code that calls all the Sakai APIs, fills up the data structures, and exercises the above utility code:
https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-common/src/java/org/sakaiproject/basiclti/util/SakaiBLTIUtil.java

The servlet that contains the code to handle the incoming web services – this of course is Sakai-specific, but the outline is useful and it makes good use of the utility code above:
https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-blis/src/java/org/sakaiproject/blti/ServiceServlet.java

A PHP test harness that pretends to be a simple tool to launch and does some very simple exercising of the XML web service APIs:
https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-docs/resources/docs/sakai-api-test

Abstract: Massively Open Online Courses: Beyond the Hype

I will be giving a Keynote speech at the Moodle Moot Australia 2013 in Melbourne, Australia June 23-26, 2013.

This is my draft abstract for that keynote. Comments welcome.

This keynote will look deeply into what MOOCs are; how they are affecting the future of the software we build to enhance teaching and learning; and how the current trends will ultimately affect real teachers or real universities. We will get beyond the hype, contrast these new systems with more traditional Learning Management Systems, then anticipate how MOOCs will progress as they move through the Gartner Hype Cycle; become more prevalent; and potentially lose sight of re-mixable Open Educational Resources.

We will examine and debunk the fallacy of the one “gold standard course” taught by some premier university that effectively converts the professors at the “rest” of the universities into local graders / mentors. Those of us who have been in educational technology for the past 20 years have seen extreme and unwarranted hype around numerous products eventually modulate into a reasonably practical approach that strives to make all teachers better, rather than obsolete.

Dr. Severance has a long history of being involved in disruptive trends in technology for teaching and learning. As the Chief Architect for the Sakai Project and first Executive Director of the Sakai Foundation, he helped form the worldwide open source community around the Sakai LMS. After he resigned as the Executive Director in 2007 (right before he was about to be fired because his passion for genuine open source conflicted with the grand top-down plans of his board of directors), he spent the next few years working with the IMS Global Learning Consortium building software and data interoperability standards to change the nature of the LMS marketplace. This work resulted in IMS Learning Tools Interoperability (LTI) and IMS Common Cartridge (CC) support across the entire marketplace. He has also contributed to the LTI support in Moodle. In 2012, he became the Sakai Chief Strategist for Blackboard, Inc. and is paid by Blackboard to work on and support the Sakai open source project. Also in 2012, he taught the online course “Internet History, Technology, and Security” using the Coursera MOOC platform. The course had over 56,000 registered students from all over the world, and 5,000 received a certificate. In 2013, in order to teach a MOOC that would augment his on-campus class, he developed his own open source MOOC framework (online.dr-chuck.com) that used Moodle as its teaching engine and taught Python to nearly 2000 students around the world in addition to his on-campus students.

bio: http://www.dr-chuck.com/dr-chuck/resume/bio.htm

LTI, Frames and Cookies – Oh MY!

I got the following question about LTI recently:

Recent Safari and Chrome browser versions have changed the way they filter 3rd party
cookies in iframes:

http://stackoverflow.com/questions/9930671/safari-3rd-party-cookie-iframe-trick-no-longer-working
http://groups.google.com/a/chromium.org/group/chromium-discuss/browse_thread/thread/91fb1bf55e483dc4

Previously, a POST in an iframe would allow the target of the POST to set cookies. However, this has changed, and those cookies are no longer allowed when 3rd party cookie blocking is enabled (the default setting). This completely breaks Basic LTI tools that use cookies to establish and maintain a session, since all of the LMS systems enclose LTI tools in an iframe.

Here is my Answer

The simple answer is that tools will need to go through a step where they attempt to set a cookie, redirect back to themselves with the session id as a GET parameter, and check to see whether the cookie is set. If the cookie is not set, they should open in a new window, passing the session id as a GET parameter, then set the cookie and redirect to themselves one more time.

Yes – it sucks. Many tools just give up and don’t even bother trying to set a cookie within the frame; if they notice that they are not the top frame, they pop open a new window with a GET parameter. Friendly instructors or admins placing the tool could make it easier and just tell the LMS to open the tool in a new window. But tools should not assume this is the case, and should gracefully deal with being in either an iframe or a new window.

I think that in the future there will be two situations. (A) Relatively large tools that insist on a new window and immediately pop themselves out into a new window if they find themselves in an iframe. (B) Small widgets that happily live in an iframe but do not use any cookies at all to maintain session – they just use GET parameters or POST parameters on every screen to maintain the session state.
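The probe-and-redirect dance described above boils down to a small decision: have we probed yet, and did the cookie survive? Here is a minimal sketch of that logic – the class, enum, and method names are invented for illustration and are not from any shipping tool:

```java
public class CookieCheck {

    // Possible responses for a tool servicing an LTI launch inside an iframe.
    enum Action { SET_COOKIE_AND_REDIRECT, CONTINUE_IN_FRAME, OPEN_NEW_WINDOW }

    // First pass: we have not probed yet, so set the session cookie and
    // redirect back to ourselves, carrying the session id (and a probe flag)
    // as GET parameters. Second pass: if the cookie survived the redirect,
    // cookies work in this frame and we can proceed normally. If not,
    // third-party cookies are blocked, so we must pop a new window – again
    // passing the session id via GET – and repeat the dance there.
    static Action decide(boolean probedAlready, boolean cookieCameBack) {
        if (!probedAlready) return Action.SET_COOKIE_AND_REDIRECT;
        return cookieCameBack ? Action.CONTINUE_IN_FRAME : Action.OPEN_NEW_WINDOW;
    }
}
```

A real servlet would wire these three outcomes to a Set-Cookie header plus redirect, normal page rendering, and a small HTML page with a window.open() call, respectively.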

How to Build a Rich / Non-Trivial IMS LTI 1.1 Provider

PHP IMS LTI 1.1 Providers

I have a “hello world” provider in PHP here:

https://source.sakaiproject.org/svn//basiclti/trunk/basiclti-docs/resources/docs/sakai-api-test/tool.php

It is nice as it does all of LTI 1.1 as well as Sakai’s extensions documented here:

https://source.sakaiproject.org/svn//basiclti/trunk/basiclti-docs/resources/docs/sakai_basiclti_api.doc

I have built a few other PHP providers that do a little more and use a database with some Authz and different implementation approaches. I used these as examples for various advanced LTI workshops I have given:

http://ims-dev.googlecode.com/svn/trunk/basiclti/php-simple/adlist/
http://ims-dev.googlecode.com/svn/trunk/lti/lms/

The best example is the Moodle LTI 1.1 Provider written by Juan Leyva. The Moodle provider is richer and more powerful than the Sakai provider (below). It does grade send-back as well as launch, and provisioning at either a course or a tool level. I would love to get Sakai’s provider to feature parity with Moodle’s provider.

http://docs.moodle.org/22/en/LTI_Provider
https://moodle.org/plugins/view.php?plugin=local_ltiprovider

https://github.com/jleyva/moodle-local_ltiprovider

Java IMS LTI 1.1 Providers

There are three samples of Java LTI code here:

http://ims-dev.googlecode.com/svn/trunk/basiclti/

They were the initial sample code, developed as a proof of concept. The Sakai LTI 1.1 provider started with this:

http://ims-dev.googlecode.com/svn/trunk/basiclti/java-servlet/

And went quite a ways beyond it. So I would ignore the above and start with:

https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-portlet/src/java/org/sakaiproject/blti/ProviderServlet.java

https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-docs/resources/docs/sakai_basiclti_provider.doc

You will note that the code here:

https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-util/src/java/

is not even copyright Sakai – it is generic utility code copyright IMS and others – and it is better tested than the code in http://ims-dev.googlecode.com/svn/trunk/basiclti/java-servlet/ – so I would start with this as your utility code.

The connection between that generic Util code and Sakai APIs happens here:

https://source.sakaiproject.org/svn/basiclti/trunk/basiclti-common/src/java/org/sakaiproject/basiclti/util/SakaiBLTIUtil.java

I made sure that I kept the generic bits and non-generic bits 100% separate.
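The heart of any LTI 1.1 provider, whatever utility library you use, is recomputing the OAuth 1.0 HMAC-SHA1 signature over the launch POST and comparing it to the oauth_signature the consumer sent. Here is a minimal sketch of that signing step – the class name, example URL, and parameter values are made up for illustration, and the real utility code linked above also handles nonce and timestamp checking:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Map;
import java.util.SortedMap;
import java.util.TreeMap;

public class LtiSignatureSketch {

    // Percent-encode per OAuth 1.0 (RFC 5849 sec. 3.6). Java's URLEncoder is
    // close, but "+", "*", and "~" need adjusting to match the spec.
    static String enc(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8)
                .replace("+", "%20").replace("*", "%2A").replace("%7E", "~");
    }

    // Build the signature base string from the HTTP method, launch URL, and
    // sorted POST parameters (oauth_signature itself is excluded), then sign
    // it with HMAC-SHA1 using the shared secret. A provider recomputes this
    // and compares the result to the oauth_signature parameter it received.
    static String sign(String method, String url,
                       SortedMap<String, String> params, String secret) {
        StringBuilder norm = new StringBuilder();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (norm.length() > 0) norm.append('&');
            norm.append(enc(e.getKey())).append('=').append(enc(e.getValue()));
        }
        String base = method + "&" + enc(url) + "&" + enc(norm.toString());
        try {
            Mac mac = Mac.getInstance("HmacSHA1");
            // OAuth signing key = encoded consumer secret + "&" + token secret;
            // an LTI launch has no token, so the second half is empty.
            mac.init(new SecretKeySpec((enc(secret) + "&")
                    .getBytes(StandardCharsets.UTF_8), "HmacSHA1"));
            return Base64.getEncoder()
                    .encodeToString(mac.doFinal(base.getBytes(StandardCharsets.UTF_8)));
        } catch (Exception ex) {
            throw new RuntimeException(ex);
        }
    }

    public static void main(String[] args) {
        SortedMap<String, String> p = new TreeMap<>();
        p.put("lti_message_type", "basic-lti-launch-request");
        p.put("lti_version", "LTI-1p0");
        p.put("oauth_consumer_key", "12345");
        p.put("oauth_nonce", "abc123");
        p.put("oauth_signature_method", "HMAC-SHA1");
        p.put("oauth_timestamp", "1234567890");
        p.put("oauth_version", "1.0");
        System.out.println(sign("POST", "https://lms.example.com/launch", p, "secret"));
    }
}
```

If the recomputed value does not match what arrived in the POST, the launch is rejected – that is essentially what the utility code above does for both the Sakai provider and consumer.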