A Colonoscopy Story

I am generally a pretty healthy guy and so I generally don’t spend much time in doctors’ offices for myself. I get a physical every 10 years whether I need it or not, and the results are (thankfully) the same.

A few weeks back, it was time for my first physical (WHERE AGE > 50) (sorry for slipping into SQL there). My doctor went through all the normal things and everything was fine and nothing had changed since my last physical in 2001. That was all great news.

But then we had “the conversation” about how people over 50 need a Colonoscopy. Given that I am in pretty good health and have no family history of cancer of any kind, let alone colon cancer, I might have had a decent excuse to say “no thanks” and walk away.

I knew this conversation was going to happen and so in the days before, I thought about my good friend and dear colleague Bob Frost.

Memorial Facebook Page
http://www.dr-chuck.com/csev-blog/2011/04/bob-frost-memorial-service/
http://www.dr-chuck.com/csev-blog/2011/03/bob-frost-1952-2011/

As much as I wanted to run away and say “no”, I knew the right answer was “yes”.

So yesterday was my date for the Colonoscopy procedure at the University of Michigan Hospital. I knew other folks who had had the procedure at UM Hospital, so that made me extra comfortable.

The preparation started Thursday morning. Most folks say the prep is the worst part. This is the prep that we chose:

http://www.med.umich.edu/1libr/aha/ummiralaxprep.htm

I followed the prep almost perfectly. Wednesday night, I had a substantial meal with steak, soup, salad and a baked potato at Buddies in Holt at about 10PM. I wanted my body to have lots of pre-digested protein, fat and carbohydrates in the bloodstream to endure the 1.5-day fast. My last solid food was an Egg McMuffin at 8AM on Thursday.

I had an IMS LTI teleconference at noon to talk about RDF and XML bindings, right when the prep was supposed to start. I foolishly waited until the last minute to buy my pills and MiraLax and Gatorade. Then the meeting started to go long. So I switched to my cell phone and drove to the pharmacy to pick up my materials. There were some detailed discussions going on as I picked up my supplies. I must have sounded pretty weird to folks in the other aisles at CVS when I started addressing the availability of RDF libraries for PHP. To be courteous, I dropped off the call to check out and then rushed back home to rejoin the call. I took the pills at 1PM (one hour late) in my car on the way home to get the prep started. I got home in time to rejoin and finish the meeting from my home office. It was a great meeting and a lot was accomplished – I was glad I did not miss it.

At about 3PM I started drinking the MiraLax/Gatorade mix. The prep instructions said to wait until “something happened” or 6PM – but nothing “happened” and I did not want to fall behind schedule, so I started chugging the 9X daily dose of MiraLax between 3PM and 5PM. By 6PM, still nothing was “happening”. I was starting to feel like a failure. But by 6:30PM, I was relieved that things started to “happen”. After that, they happened about every 30 minutes pretty much like clockwork.

For me, there was never any bloating, cramping, or discomfort. About every 30 minutes I would begin to “feel the need”, would take care of business, and then go lie down and watch TV. I was not tired or uncomfortable at any point. But I was distracted enough to screw up and miss an IEEE Computer Editorial Board teleconference at 4PM (sorry Ron and Judith). I could have made the meeting – I just forgot because at 4PM I was too focused on the prep, worried that nothing was happening and that I was falling behind schedule – and I hate to fall behind schedule.

By 9PM things had been progressing very comfortably and regularly every 30 minutes. I had to make a run to pick up some football tickets for a Friday night game from my friend John Liskey. He was at a bar about 20 minutes away from my house. So I waited until “something happened” and then rushed to the bar. There was construction on the route I took so it took longer than I had anticipated (uh-oh). At 9PM on a Thursday they were resurfacing Cedar Street and, in a pipelined fashion, they were grinding off a lane, cleaning it, laying down new asphalt, and then finishing the asphalt – all in an amazing dance that took about 500 yards of machinery. It was beautiful engineering – but my mind was elsewhere as the traffic crawled along. I got through traffic and made it to the bar before I was completely uncomfortable – I rushed in, waved at John, and then ran to the restroom. I stayed at the bar for 30 minutes, and when it happened again, I knew I had 30 minutes to get home so I quickly said my good-byes and took a much quicker route back home.

By 10:30 things were slowing down to more like once per hour. So I decided to try to get some sleep. I slept for a while, then needed to get up, and then went back to sleep. It was OK – kind of like a west-coast redeye flight. I got enough sleep to trick my body into not being tired the next day. At 4AM, I started the next 9 doses of MiraLax plus Gatorade and pretty quickly we were back on the every-30-minutes schedule. By 5AM I decided to stop trying to go back to sleep and started to catch up on E-Mail in between sittings.

I was a little concerned with the “output”. Throughout the night, there were fewer and fewer solids and at about 6AM there were no solids at all. But the color was brownish and still not fully clear as specified in the prep instructions. Again I started to feel like I was behind schedule and might end up failing.

My “last allowed liquid” time was 9AM so I savored a cup of good black coffee (Loving my Bunn Velocity Brew Coffee maker) with a bit of sugar – but no cream. I did not want to end up with a caffeine headache during the day. By 9AM, the mean time between sittings was creeping back up to 45 minutes and the “output” was increasingly light, but it was not “clear and yellow” per the prep specifications. I was quite worried. At 9AM I felt confident enough to do some lifting and hauling and did an errand that took me about 45 minutes – no problem at all – by then the pace was slowing.

At 10AM it was time to take the 1-hour drive to Ann Arbor and the UM Hospital. So off we went. About 45 minutes into the drive, I started to get “the feeling” and would have been most happy if we stopped. But since we were running late, stopping was not in the cards. It was not really uncomfortable but I was really happy when we arrived at the hospital. And doubly happy that the hospital has restrooms as soon as you walk out of the parking ramp and into the hospital.

And the awesome news was that the “output” was perfect! It met all the prep specifications and then some. I was so happy – it was so pretty. I should have taken a picture and tweeted it! All the stuff I had read suggested a lemonade-color. But this was more of a pretty deep yellow color – more like egg-drop soup without the eggs. To me, it was so pretty because the last thing I wanted to do was disappoint my doctors and I certainly did not want to have to come back and try again another week. School starts Tuesday and it is hard to take two days off while teaching.

So we rushed to the Medical Procedures Unit five minutes late and checked in. I have been vague-tweeting because I did not want folks to get concerned. When they called my name, I made my last tweet and then gave my phone to Teresa, and in I went to get my gown and lie down on my side.

Inside the unit, I changed into my gown and went to the bathroom one more time – and yes it was that awesome clear yellow color. I had achieved the specifications of the prep only 60 minutes before my appointment – but hey – it was on time and on budget – so I was very pleased.

There was a really nice clinician who wanted to ask me some survey questions for a study they were doing. When she got to the privacy part, she said, “The worst case that might happen is that if one of our servers were hacked, someone might figure out that you had a Colonoscopy.” I told her I was not too worried about privacy since I had tweeted that I was having one about 15 minutes earlier. It was a cool survey and I told her I wished I had the “as-built” data to inform my prep. She laughed. I told her that during the prep, I really wanted to do it right and I spent a lot of time on Google trying to find anything that might give me some kind of summary of what should happen when as you do the colonoscopy preparation. They were great at telling me what to do – but not so good at telling me what to expect and how the prep would progress. I also said that their timing was perfect – although I might have wished it had finished 2 hours earlier. If I do this next time, I might start the process at 11AM. But of course, because of the IMS LTI meeting I did not start until 1PM – so I *was* an hour late starting – so the schedule slip was my fault (Mark, Greg, Lance, and Colin contributed to the schedule slip unknowingly).

Once I finished the survey, it was time for the IV – this was the scariest part for me. I have a weird relationship with needles. I can handle the pain but somehow I am scared of them (except of course when I am getting a tattoo). One time many years ago, I fainted after I stood up having blood drawn. So since then I tell anyone about to put a needle into me that I am a wuss. I think that telling them I am a little scared makes it OK for me; since I started telling them, I have had no problems. But my veins are too small and it took her four tries to get me locked and loaded. Gaak. It was not painful but more nerve-wracking than anything else. But the clinician kept asking me questions about the prep and it distracted me nicely. Now that I think about it – maybe she was not doing a survey at all – she was just there to distract me – Hmmm. Whatever. It worked. And we had the IV.

The nurse asked me all the stock questions and that went fine. Then we went over all the risks and that was fine. Then we got to the part where I was going to sign it all and I had a little concern. I felt uncomfortable signing a document that said they were going to look in and remove whatever polyps they found so they could do a biopsy. The one other person I knew who had this done – they found one polyp and removed it.

What concerned me was that I was signing a “blank check”. For me, taking a quick look and removing 1-2 polyps for diagnostic purposes was OK and understandable. But what if they found more? How many would they take out? It seemed to me that at some point, when you go from 1-2 removals to 10+ removals (which would be very rare), you are switching from a low-risk diagnostic procedure to a small major surgery. And frankly, I wanted to understand what was going on if there was to be a major surgery – I might want a second opinion or something and I wanted to be emotionally prepared if there was something major. What made me most uncomfortable was the sense that they would put me out and then, when I woke up, they would tell me whether I had had a routine diagnostic procedure or (what seemed to me to be) a small major surgery. I was not ready to let that decision be made while I was under anesthesia. That was my decision to make and I wanted to be awake and presented with the evidence so I could participate in that choice.

This was the first time they joked and mentioned the phrase “Doctors make the worst patients.” I went through the story of my concerns four times. First I explained it to the clinician doing the survey and she told me horror stories of colon cancer. I then told her the story of Bob Frost and told her I was properly respectful of colon cancer but wanted to be part of any major surgery decisions. The clinician was nice about it. Then the receiving nurse came in and we had the same conversation. She just said, let’s sign the consent form as-is and put a sticky note on it. I agreed.

Then the nurse supervisor came in and we did the story again. She was somewhat stern but understood. I am sure that the problem is that if they need to bring the doctor out to talk to me, it messes up their timing a lot. I understand efficiency concerns but I was not going to agree to something I did not understand. So I lay there for what must have been 45 minutes all prepped and ready to go, telling this story to folks over and over again. I watched others being wheeled in ahead of me whilst I waited (all the while with no laptop or iPhone to pass the time).

Since I was insisting on discussing the procedure – I spent more time waiting. Lying in the hospital bed looking out the window with nothing to do, I spent a lot of time thinking about Brent, his surgeries, and the fact that he is working through some tough issues.

Finally the doctor came in. She was awesome. I explained my concerns about being comfortable with removing a polyp or two for a biopsy and not comfortable with declaring “open season” if there were a small army of polyps when they got in there. She assured me that if there were more than 1-2 polyps that she would stop at 2 – so I agreed and we tore up the sticky note and were good to go.

In a few more minutes they wheeled me into the room with the nurses and the doctor and I rolled over on my left side. Everyone was joking, light-hearted and laughing. They were amazed that I scheduled a colonoscopy on my birthday. I told them it was my birthday present to myself. I told them that if it were my fiftieth birthday it would show how much I wanted the colonoscopy; instead, it took four years before my physician “caught up with me”. I also told them that I needed to get this done before classes started at the University of Michigan on Tuesday September 6.

They put the drugs in my IV and in about 15 seconds, I said – “Things are a little out of focus” – and then 5 seconds later I gently went to sleep. The next thing I know I am back awake with no discomfort at all – I mean zero – no discomfort. I am looking at a computer monitor showing video from the camera. I don’t know if it was video being replayed or it was live – but I was seeing them pulling the camera slowly back out of my intestine. It was so awesome – I wish I could have had the whole video – you could see the folds and chambers of the inside of my colon – it was the coolest thing.

It turned out that I had one polyp (completely normal) and they took it out to examine (completely normal). They wheeled me into recovery where Teresa came in. They gave me some pictures of the polyp and the test results. I got my iPhone back and tweeted post-procedure with a picture of my scary IV that was still in. In about 20 minutes I was ready to get dressed and leave. There was no lingering pain, no bloating – absolutely no abnormal feeling at all. It was literally as if nothing had happened.

So we left (I of course was not driving) and went to the Caribbean Jerk Pit for lunch. Since I had just had a Colonoscopy I backed off and only had medium spiciness. I figured I would take it easy. Because of the timing of the steak meal and Egg McMuffin, I never experienced any hunger. I am a bit weird because I skip meals all the time and don’t feel hungry – so your mileage may vary on the hunger thing.

On the way back to Lansing we stopped at the Secretary of State office to renew my plates (it is my birthday, remember) since I had waited until the last minute for that as well. Then we went to Culvers for some custard.

Then we went home and changed and went to the football game at 7:30PM. I would point out that after I went to the bathroom about 11:30 AM right before I went into the procedure, I had no more urges to go to the bathroom at all. The timing and dosage of the MiraLax cocktail was perfect.

I next went to the bathroom normally on Saturday morning (20 hours after the colonoscopy) and everything was normal – no pain, no bloating, no urge, no nothing.

All in all, it was an amazing experience and I am nothing but impressed with the medical professionals at the UM Hospital who do this procedure. And it is an awesomely good feeling to know that my colon is as healthy as the rest of me. And I certainly have Bob Frost to thank for making sure I had the backbone to actually do this and not delay it.

Obviously if you are looking at this considering your own procedure, all situations are different. Other than being a bit chubby and sitting in front of a laptop computer too much, I am in excellent health and this was a *diagnostic* procedure as part of a physical exam. If you have health issues or are on medication you may have a quite different experience. I guess what I would suggest is not to be worried about a Colonoscopy procedure at all if you are in good health.

I hope that if you have read this far, (a) you are not too grossed out, and (b) you have found this helpful.

Comments and suggestions welcome.

The Funniest Wikipedia Article I Have Ever Seen: Outcomes Based Education

Perhaps my sense of humor is completely warped. Perhaps I am just a mean and hateful person looking for the negative in everything. But I found the following Wikipedia article totally funny. It is really the first example I have seen where a Wikipedia article, while appearing to be factual, has a cleverly encoded contrarian perspective hidden inside.

http://en.wikipedia.org/wiki/Outcome-based_education

As best I can tell, this is a classic Mark Antony “I come to bury Caesar, not to praise him” speech. While appearing to be factual and neutral, the underlying theme of this writing is that Outcomes Based Education taken to excess is a mistake. But the “this idea is overhyped” message is cleverly wrapped in highfalutin passive-voice academic writing to the point that it nearly masks the intent of the writer.

I think that many of the sentences in these paragraphs are dripping with sarcasm. The real message is just barely visible under the surface of the prose. I copy the verbatim text from Wikipedia here on the off chance that someone will edit the article to remove the delightful, deadpan sarcasm.

Outcome-based education (OBE) is a recurring education reform model. It is a student-centered learning philosophy that focuses on empirically measuring student performance, which are called outcomes. OBE contrasts with traditional education, which primarily focuses on the resources that are available to the student, which are called inputs. While OBE implementations often incorporate a host of many progressive pedagogical models and ideas, such as reform mathematics, block scheduling, project-based learning and whole language reading, OBE in itself does not specify or require any particular style of teaching or learning. Instead, it requires that students demonstrate that they have learned the required skills and content. However in practice, OBE generally promotes curricula and assessment based on constructivist methods and discourages traditional education approaches based on direct instruction of facts and standard methods. Though it is claimed the focus is not on “inputs”, OBE generally is used to justify increased funding requirements, increased graduation and testing requirements, and additional preparation, homework, and continuing education time spent by students, parents and teachers in supporting learning.

Each independent education agency specifies its own outcomes and its own methods of measuring student achievement according to those outcomes. The results of these measurements can be used for different purposes. For example, one agency may use the information to determine how well the overall education system is performing, and another may use its assessments to determine whether an individual student has learned required material.

Outcome-based methods have been adopted for large numbers of students in several countries. In the United States, the Texas Assessment of Academic Skills started in 1991. In Australia, implementation of OBE in Western Australia was widely criticised by parents and teachers and was mostly dropped in January 2007. In South Africa, OBE was dropped in mid 2010. OBE was also used on a large scale in Hong Kong. On a smaller scale, some OBE practices, such as not passing a student who does not know the required material, have been used by individual teachers around the world for centuries.

OBE was a popular term in the United States during the 1980s and early 1990s. It is also called mastery education, performance-based education, and other names.

Yeah, what might be some of those other names that might be used to describe OBE?

Yes, I know – I am just like those old guys on the Muppet Show – always finding something to be grumpy about :).

P.S. I am not anti-Outcomes-Based Education – I think good teaching draws from a lot of techniques. There is no single silver bullet that “solves teaching”. All too often experts pick one technique and then run around like Mario with his hammer in Donkey Kong, just banging the same hammer over and over everywhere they go. Real teaching is very dynamic and requires a good teacher to use the right technique at the right time.

Video from SF Startup EDU Weekend

Back in June, I went to a Startup Weekend (sfedu.startupweekend.org) event at Grockit (www.grockit.com) headquarters in San Francisco where about 80 people spent a weekend developing ideas for a startup. I pitched the notion of an App Store, and working with Roby John and Aamir Poonawalla from TapToLearn, we kind of hacked up a cool IMS LTI launch to an iPhone app (video below).

They made a really cool promotional video for the weekend here:


Here is our demo video (no audio):



A Summer Month of Code – Learning Tools Interoperability in Sakai

I am really pleased with the new Basic LTI and IMS Common Cartridge 1.1 support in Sakai 2.9. If you want to skip the boring detail below – just watch this video and send it to your friends:



http://vimeo.com/27113903

Here is the back story.

Back on July 2, I decided to build a whole new capability to support Basic LTI and IMS Common Cartridge 1.1 in Sakai. I kicked it off with the following blog post:

Coming Back Home to Sakai 2.9 (July 2)

I talk about how Sakai started out with the first shipping support for IMS Basic LTI over two years ago, but the support from the other vendors had matured very nicely and our support was looking a little oversimplified and creaky. With my desire for Sakai 2.9 to be a nice, fresh relaunch of Sakai with its new Portal and Rutgers’ Lesson Builder, it seemed a perfect time to dive into building state-of-the-art Basic LTI support for Sakai and prepare Sakai to support additional new LTI features coming from the IMS LTI working group thanks to Lance Neumann of Blackboard and Greg McFall of Pearson.

I had a deadline of August 1 because Chuck Hedrick and Eric Jeney of Rutgers were madly working on improving Lesson Builder for a Fall rollout at Rutgers and inclusion in the Sakai 2.9 release. Chuck built IMS Common Cartridge 1.0 support and all of IMS CC 1.1 support except the Basic LTI part. Without Basic LTI support in Lesson Builder, we could not claim compliance to IMS CC 1.1.

So I needed to build a new capability for Sakai, plug it into Site Info, build a new administrator tool, and then plug it all into Lesson Builder’s authoring environment and import code. And it all had to be done by August 1 to give Chuck enough time to test it for production at Rutgers.

This resulted in a four-week sprint where I coded on planes, in my home office, on a camping trip, at the joint Sakai/Jasig meeting in New York, during cab rides to and from airports, and during the Blackboard Developer conference. You can even see a mess-up at the end of my recorded BBDevCon presentation because I was coding on the new Sakai features right up until five minutes before my talk and forgot to test a demo that John Fontaine and I were doing. It was a 24×7 code sprint for four weeks.

Here is my presentation at BbDevCon11 – the disaster is towards the end. It all finally worked and the source of the problem was filtering of port 8080 – but it had me flustered for a while thinking all Sakai servers in the world were down. You can hear Fontaine in the background giving me a good-natured ribbing as I struggled to find a Sakai server that worked. I eventually just did the launch of CourseSites from Michigan’s CTools production server (because it did not use port 8080):

IMS Standards at BbDevCon11

You can also see a demo of the Fontaine/Severance hack without a disaster in the middle here:

http://vimeo.com/26310497

I do need to apologize to everyone who I interacted with during July if I seemed distracted and itchy to get back to my laptop. I was not just writing code, I was also building a UI and persistence framework and then using that framework to build my tools.

What We Accomplished

I am mixing the work of Chuck and Eric with my own work.

  • Built a new “External Tools” extension to Site Info that lets you manage your tool installations (Full LTI coming soon) as well as make tool configurations. These tool configurations have URLs that can be used elsewhere in Sakai – in Resources, Wiki, Melete, anywhere you can put a URL. You can spread the URLs around and still manage all the tools (change keys, settings, etc.) from one central location in Site Info.
  • Added a new tool for System Administrators to monitor usage of external tools across the system, and reach in and tweak settings on a tool installation or a tool configuration. They can create site-wide tool installations that appear in all sites.
  • Lesson Builder now has the ability to install, configure and launch external tools right from within the Lesson Builder UI as another resource type. In particular, instructors can set per page values for LTI launches like custom parameters very easily.
  • Lesson Builder imports Basic LTI items from IMS 1.1 Common Cartridges. Since cartridges do not have keys and secrets, the import either hooks the imported items to an existing installed tool with a key and a secret or makes a new tool with an un-configured key and secret. There are several workflows for getting a key / secret set on a tool: through the Administrator tool, through Site Info, or even at the moment an instructor launches a partially configured tool. The flow was designed to be as natural for the instructor as possible.

Everything is working and demonstrated in the above video. I am sure that there are little rough edges, UI-wise, in our flows and screens, and we can tweak those as more people look at the code (now sitting in the trunk of Lesson Builder, Site Info, and Basic LTI). We are very much open to suggestions as folks review the work. But hurry because code freezes are coming up quickly.

Technical Details

First, I had to decide which display technology and persistence approach to use to build my new tool in Sakai. My preference for a new tool is by far to build a JSR-168 portlet using JSP and JSTL as the presentation layer. My Basic LTI portlet avoided the persistence problem by using Sakai’s Placement properties. But this tool needed to be a frame-based helper because it needed to work with Site Info and Lesson Builder, so a portlet was not possible. Also, since this needed lots of tools independent of a particular placement, I had to break down and make some database tables – so choosing a persistence layer was necessary for this project.

I don’t like heavy frameworks like JSF, RSF, and Wicket because they force tons of extra Java and XML just to print out “Hello World”. I was going to have a very complex data model and very dynamic UI because the administrator gets to control what fields (like popup and custom parameters) the instructor can configure on each LTI tool placement. I did not feel like encoding it all into a bunch of getters, setters, and bean properties, having to code everything 4-5 times when I wanted to change something.

So I decided to use Velocity augmented by my own presentation layer. I had built a really cool presentation layer when I wrote the Basic LTI code for ATutor where I made it really easy for myself to make a complex dynamic UI. I called that notion the “Form Oriented Object Relational Model”. I documented an early PHP version of FOORM in this blog post:

ATutor with Basic LTI is Released

Instead of writing thousands of lines of PHP to implement the views for ATutor, I decided to make a library to build the HTML for the sets of fields I needed. This is also a bit of a riff on the way Moodle handles forms generically. Moodle provides a set of form-handling routines that enforce consistency of look and feel and keep the details hidden behind an abstraction. It worked but was a little too chatty, with too many lines needed for each form field. The Moodle model is similar to Glenn’s Ambrosia approach.

I took the opportunity to rewrite my FOORM code in Java and clean it up and make it work more generally.

FOORM Presentation and Persistence

In FOORM you build a model using an array of strings that looks as follows:

  static String[] CONTENT_MODEL = { 
      "id:key", 
      "tool_id:integer:hidden=true",
      "SITE_ID:text:maxlength=99:label=bl_content_site_id:role=admin",
      "title:text:label=bl_content_title:required=true:maxlength=255",
      "frameheight:integer:label=bl_frameheight",
      "newpage:checkbox:label=bl_newpage",
      "debug:checkbox:label=bl_debug",
      "custom:textarea:label=bl_custom:rows=5:cols=25:maxlength=1024",
      "launch:url:hidden=true:maxlength=255",
      "xmlimport:text:hidden=true:maxlength=16384", 
      "created_at:autodate",
      "updated_at:autodate" };

This is kind of like a Rails model in colon-separated form. One thing that FOORM assumes is that you will be internationalizing your UI, so it even encodes the field labels (as resource bundle keys like bl_content_title).
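To make the colon-separated format concrete, here is a minimal sketch (my illustration, not the actual FOORM parser) of how one field description decomposes into a name, a type, and attribute key/value pairs:

    // Minimal sketch, not the actual FOORM code (needs java.util.Map/HashMap):
    String field = "title:text:label=bl_content_title:required=true:maxlength=255";
    String[] parts = field.split(":");
    String name = parts[0];   // "title" - the column and input name
    String type = parts[1];   // "text" - drives both the HTML input and the SQL type
    Map<String, String> attrs = new HashMap<String, String>();
    for (int i = 2; i < parts.length; i++) {
      String[] kv = parts[i].split("=", 2);             // e.g. "maxlength=255"
      attrs.put(kv[0], kv.length > 1 ? kv[1] : "true"); // bare attributes default to true
    }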

You can build an input form with the following call:

String formInput(Object previousData, String [] modelInfo, Object resourceLoader)

This returns a set of input tags, one for each field, repopulated from the previous data (can be Properties, Map<String,Object>, or ResultSet) and using the provided resource loader.
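For example, generating the form and handing it to the Velocity context looks roughly like this (a sketch; loadContent() and context are hypothetical stand-ins for however the tool loads prior data and renders its template):

    // Sketch of typical usage; loadContent() is a hypothetical helper.
    Map<String, Object> previous = loadContent(contentId);  // previously saved values, if any
    String html = foorm.formInput(previous, LTIService.CONTENT_MODEL, rb);  // rb = resource loader
    context.put("formInput", html);  // rendered as $formInput in the template below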

You embed the form tags in some Velocity as follows:

<form action="#toolForm("")" method="post" name="customizeForm">
    $formInput
    <input type="hidden" name="sakai_csrf_token" value="$sakai_csrf_token" />
    <input type="submit" accesskey="s" class="active" name="$doAction"
        value="$tlang.getString('gen.save')" />
    <input type="submit" accesskey="x" name="$doCancel"
        value="$tlang.getString('gen.cancel')"
        onclick="location = '$sakai_ActionURL.setPanel("Content")';return false;">
</form>

There are utilities to validate incoming data and extract it into the right objects and types for inserting into a database, all with very few lines of code. Here is an extract from the request properties into newMapping, followed by an insert into the database:

    HashMap<String, Object> newMapping = new HashMap<String, Object>();

    String errors = foorm.formExtract(requestProperties, formModel, 
        rb, true, newMapping, null);
    if (errors != null) return errors;

    String[] insertInfo = foorm.insertForm(newMapping);
    String makeSql = "INSERT INTO " + table + " ( " + insertInfo[0] + 
          " ) VALUES ( " + insertInfo[1] + " )";
    final Object[] fields = foorm.getInsertObjects(newMapping);
    m_sql.dbInsert(null, makeSql, fields);

It supports a very REST-style property-bag approach to data models, except that all the fields are properly and fully realized as database columns so they can be searched, selected, grouped, and ordered with as sophisticated a query as you want to create.
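For instance, assuming the lti_content table generated from the model above, a search is just plain SQL over real columns (a sketch):

    // Sketch only: every model field is a real column, so ordinary
    // SQL handles searching and ordering with no special support.
    String sql = "SELECT id, title, launch FROM lti_content " +
        "WHERE SITE_ID = ? ORDER BY updated_at DESC";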

FOORM is designed to be extended so you have the generic Foorm and you extend it to make a SakaiFoorm that captures all the Sakai-specific rules for generating fields, handling internationalization, etc.
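The shape of that extension is roughly the following (a sketch; the hook name getI18N is my assumption, not the actual method):

    // Sketch of the extension idea; the overridden hook is hypothetical.
    public class SakaiFoorm extends Foorm {
      // Resolve label keys like "bl_content_title" through Sakai's
      // ResourceLoader instead of a generic resource bundle.
      @Override
      public String getI18N(String key, Object loader) {
        if (loader instanceof ResourceLoader) {
          return ((ResourceLoader) loader).getString(key);
        }
        return super.getI18N(key, loader);
      }
    }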

I even dispensed with the “CREATE TABLE” scripts – since Foorm knows the entire model, it can make the tables and sequences automatically. It knows about the various rules for fields in MySQL, HSQL, and Oracle. The AutoDDL code looks as follows:

    String[] sqls = foorm.formSqlTable("lti_content",
        LTIService.CONTENT_MODEL, m_sql.getVendor(), doReset);
    for (String sql : sqls)
      if (m_sql.dbWriteFailQuiet(null, sql, null)) M_log.info(sql);

I could make this more succinct with an overridable method in Foorm. Foorm still needs to learn about changes to data models and automatically generate “ALTER TABLE” commands. I will write that code when a future version of DBLTIService is released and I need to expand the model and do a conversion.

I think that Foorm has a lot of potential. It is kind of a weird combination of a partial presentation layer and partial object relational mapper. It is focused on providing useful library utilities that let the developer get as tricky as they want, making sure that the nasty repetitive common tasks take as little effort as possible.

You can ask me “Why not Hibernate?” or “Why not Spring JDBC?” at a bar sometime. Be prepared to hear my voice raise a few notches as I rail that “Hibernate is an overbearing, over-engineered steaming pile of…(oops did I say that out loud?)” or “Spring JDBC makes SQL portable in Java only to the level that they solve the trivial problems and punt on anything mildly interesting or complex…”. If I have had a few beers before the conversation starts, it will be more (or less) interesting.

Oh by the way, Foorm provides a nice, generic way to handle paged SQL queries. I still need to build advanced WHERE clauses, and support ORDER BY and GROUP BY operations.
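The paging is where vendor quirks show up yet again; the idea is roughly this (my sketch, not Foorm’s actual code):

    // Sketch of vendor-specific paging, not the actual Foorm code.
    String getPagedSelect(String select, int first, int last, String vendor) {
      if ("oracle".equals(vendor)) {
        // Oracle pages with a ROWNUM wrapper query
        return "SELECT * FROM (SELECT q.*, ROWNUM rnum FROM (" + select +
            ") q WHERE ROWNUM <= " + last + ") WHERE rnum >= " + first;
      }
      // MySQL-style LIMIT offset,count (HSQL needs its own variant)
      return select + " LIMIT " + (first - 1) + "," + (last - first + 1);
    }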

Missing pieces like these are why Foorm is still a prototype and I don’t want anyone else using it for a while. I feel that any library / framework should be used by its creator for a long time, solving lots of real-world problems, before anyone claims to have something truly reusable. But I figured I would show folks what I am thinking.

I have been experimenting with finding the right combination of power and convenience for both display layers and persistence. Before FOORM there was OMG-ORM (http://omg-software.com/) – it too was an exploration of the balance between powerful capabilities and intrusiveness as well as trying to make things intuitive. FOORM is much simpler than OMG and I find it more natural to use.

Summary

In short, it has been a heck of a month of pure code to the exclusion of nearly everything else. I built a UI framework and persistence layer and then built a tool on top of my new frameworks. Chuck Hedrick and Eric Jeney of Rutgers sprinted on Lesson Builder (pausing along the way to take the “Upload web site” code from Adrian Fish of Lancaster). And in the last few days, Chuck, Eric, and I sprinted to connect it all together so that Lesson Builder has great, extensible Basic LTI support in both its authoring and import.

It means that Sakai 2.9 really catches up in many important ways to Blackboard, Desire2Learn, Learning Objects, Jenzabar, Instructure, and OLAT in their support of Basic LTI and Common Cartridge. Blackboard is still the only CC 1.1 exporter – so they have the lead on that one, and I don’t think we will have it in Sakai until sometime next year. Export is the hard part that makes import easy – my hat is off to Blackboard for making it work. Blackboard’s CC 1.1 export is why they were the first in the IMS ring of tattoos.

But for me, export is a lower priority than the next round of innovations for LTI coming from the IMS LTI Working Group led by Greg McFall of Pearson and Lance Neumann of Blackboard. This work is ready to pop and we should see some real progress on adding auto-provisioning and run-time services to LTI implementations in the field. The spec leads have done a great job and the new Site Info work is ready to be extended to add these new features as we get the standards out the door.

It might be a good time to start to watch IMS closely. You only have a few more days to make the IMS meeting in Redmond, WA at Microsoft’s HQ. You have to pre-register – so hurry!

http://www.imsglobal.org/aug2011microsoft.cfm

Sakai 2.9 External Tool Support (IMS Basic LTI)

I just updated Sakai’s trunk to a new Basic LTI Administration tool. This tool is intended to be added to the !admin site and installs itself as an extension to Site Info.

It allows administrators and instructors to make Basic LTI Tools and Content Items. Each tool can have many content items. Content Items have Launch URLs that can be used anywhere within Sakai like Resources, the iFrame Tool, Melete, literally anywhere.

Here is the JIRA describing the changes: https://jira.sakaiproject.org/browse/BLTI-119

In particular, this lays a tool and service foundation for Lesson Builder to easily add an “Add External Tool” feature both for its authoring and when it is importing IMS Common Cartridges (version 1.1).

This also lays a foundation / starting point to add support for auto-provisioning capabilities for Full LTI.

This design is informed by the great Basic LTI work in Moodle, ATutor, Blackboard, Desire2Learn, and Instructure’s Canvas. As more experience is gained in the use of BLTI in LMS systems, the UIs are starting to converge. I like the new terminology where we call these “External Tools” rather than “Basic LTI Tools”.

There is still work to be done and comments are welcome. I am trying to get it ready in time to fit it into Lesson Builder for 2.9.

Tech note to remember this command:

svn merge -r94350:HEAD https://source.sakaiproject.org/svn/basiclti/branches/SAK-20774

Comment: Why Fairy Dust?

I was reading a blog post about machine learning and fairy dust:

http://stdout.be/2011/07/18/machine-learning-fairy-dust/

It prompted me to make the following comment as I am a general critic of the “fairy dust” approach to problem solving. Of course – it is just a rant (one of my more light-hearted rants).

The problem is all wrapped up in how funding / opportunities are distributed. The folks with money who want to affect the future positively are generally not brilliant in any domain other than making money. So they consult “experts” as to what is the “next big awesome thing” – or as you put it, the “next fairy dust”. Once an expert convinces a person with the money of the veracity of the particular brand of dust – it becomes dogma. And those who want money salute the “fairy dust” and make loud protestations as to the general amazingness of “fairy dust”. They also shout down “non-believers” to show their undying loyalty to the fairy dust. They give invited keynote speeches about the future of dust.

By doing these things, they get funding. And because they are funded – the market assumes they must be right because no one would fund a “bad idea”. Slowly public opposition to fairy dust goes underground until the funds are exhausted (usually having virtually no impact). Since everyone is so ashamed at the amount of waste in the name of “fairy dust”, no one ever goes back and checks who was wrong and where they went wrong. It is just easier for the “dust riders” to lay low for a while and then re-emerge to flow to the next source of funding “fairy dust”.

About: Tattoo

Jeff Longland made a post about Blackboard and mentioned my Tattoo:

http://jlongland.wordpress.com/2011/07/13/a-call-to-blackboard-openness/

Here are my comments.

Jeff – Interesting post. I do agree with your first point – I think that creating a more porous boundary between Blackboard and its community is in their best interest and will happen in time. I disagree with your second point somewhat. Simply calling for the release of source code is kind of pointless. Open Source is just a small part of an overall equation. Companies like Instructure do it from the beginning and that is good – but rest assured that there are plenty of bits that Instructure does not release. MoodleRooms has their secret sauce that does not get released. Simply getting your hands on a ZIP file with some Java code almost means nothing unless it is embedded in an open ecosystem. I am sure that the Atlassian release of their code was just a small step in an overall evolution of their culture that took many small steps. Releasing source is not a step function.

That said, I think that it would be a cool step forward if folks who had proprietary software had forums where they could share their production experiences, good and bad. One thing good about Sakai is that our warts are on display in public lists. When someone has a crisis, we all have it with them and learn together. I understand it is scary to let those kinds of conversations happen in the open – but it is also freeing after you get used to it.

In terms of the Tattoo – it is not just a “Blackboard” Tattoo – the Tattoo has a Sakai logo at the center circled by a ring of smaller tattoos. The ring is a logo for each Commercial or Open Source LMS that releases Certified Basic LTI 1.0 support in their core product (i.e. not a patch, building block or add-on – part of the core code). I am staging the tattoos so my shoulder is not all healing at the same time. The first two tattoos were an IMS logo and a Blackboard logo. The next logos that will be added in the next few weeks are Desire2Learn, LearningObjects, OLAT, and Jenzabar. As best I know, Instructure has a Basic LTI Consumer in their repository but no one has seen it in a release – and it is not yet certified. Also Moodle has a module (basiclti4moodle) but it is not in the core code base so Moodle is not yet eligible for a tattoo. But we are hopeful Moodle will ship support for BLTI in 2.2 so they too will be on my shoulder.

If Moodle and Instructure release Basic LTI, that will complete my tattoo as there are 8 slots in the “ring of compliance”.

And then of course I will write another book and use that Tattoo as the cover of that next book.

http://www.dr-chuck.com/sakai-book/

You can see the Sakai Tattoo without Blackboard or IMS on the cover of my current book :)

You will never get anywhere if your first thought is, ‘Who will I blame if this goes wrong?’

I just came up with this saying this morning.

You will never get anywhere if your first thought is, ‘Who will I blame if this goes wrong?’

I don’t have much else to say – except perhaps buy my book about Sakai and Open Source.

http://www.dr-chuck.com/sakai-book/

This morning is mostly coding after I catch up on Twitter and Google+.

Coming Back Home: BasicLTI for Sakai 2.9 – SAK-20774

It is kind of weird. Over the past two years, I have been putting a lot of energy into IMS Basic Learning Tools Interoperability and even have a new series of IMS BLTI tattoos healing as I write this blog post. It has been my focus for a long time.

I started out a long time ago writing a tool for Sakai 2.7 and this was one of the early crop of Basic LTI tools that staked out the territory. I kept the Sakai tool simple and direct, trying to also make it a nice example of how to write a portlet inside of Sakai. Since Basic LTI uses iframes internally, it was really nice not to have to solve the iframe-within-an-iframe problem in Sakai.

I wish more folks would either build new tools or convert existing tools to be portlets in Sakai. The Basic LTI tool really worked the kinks out of JSR-168 support in Sakai.

Once I had the Sakai tool basically functional, I spent my time trying to get the market interested in Basic LTI.

The path led to Barcelona where Marc Alier, Jordi Piguillem Poch and Nikolas Galanis built a Moodle Module for 1.9.

The great folks at Desire2Learn put it into their release 7.4.2 and announced it at Educause in November 2009. That was a great leap forward.

Then I decided to work with the folks from OLAT at the University of Zurich; I built an OLAT Basic LTI tool and visited Zurich to get it integrated into OLAT 7.0.

Both Learning Objects and Jenzabar added Basic LTI to their LMS systems.

Then we invented the IMS Basic LTI Extensions to send grades back to the LMS and allow a full roster pull from the LMS over REST web services.
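To give a flavor of the grade extension (a sketch from memory; check the published extension spec for the normative names): the LMS includes an outcome service URL and a result sourcedid in the launch, and the tool later sends an OAuth-signed form POST back to set the grade:

    // Flavor sketch of the grade write-back; verify parameter names
    // against the published extension spec before relying on them.
    Map<String, String> post = new HashMap<String, String>();
    post.put("lti_message_type", "basic-lis-updateresult");
    post.put("sourcedid", sourcedid);  // the lis_result_sourcedid from the launch
    post.put("result_resultscore_textstring", "0.85");  // grade on a 0.0-1.0 scale
    // Sign the POST with the same OAuth key/secret as the launch and send it
    // to the outcome service URL the LMS provided.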

I went back and added the REST web services to Sakai 2.8 and we built them into Moodle 1.9 and then ported everything to Moodle 2.0 with a whole new user interface that dealt with these services.

All the while we were refining the user interface.

Then late last year I decided to add Basic LTI support to ATutor. It was fun to work with the ATutor team, and since they had just implemented IMS Common Cartridge 1.1, ATutor’s internal structure was well suited for Basic LTI and we built what I think was the best UI for Basic LTI.

Then early this year Blackboard released Learn 9.1 Service Pack 4 with support for Basic LTI. Their UI had some clever innovative tweaks and gave the instructor and admin the ability to look at and monitor where the active links were happening and find broken links, etc – it was very cool.

Then about a month ago Marc, Jordi, Nikolas and I ended up in a sprint to try to get Basic LTI into the trunk of Moodle 2.1 – we missed it – but should make it into 2.2. As the Moodle Central team reviewed our code, the biggest gaping hole was the lack of support for import/export. So we (mostly Nik with help from Jordi) came up with an awesome UI and workflow for import and thought it through very deeply.

Blackboard also supports import of IMS CC 1.1 so they have it thought through as well. Seeing the Blackboard 9.1 SP4 UI was helpful, as were the long discussions about Full LTI with Greg McFall of Pearson and Lance Neumann of Blackboard in the IMS LTI Working Group.

So, the Moodle Backup/Restore turned out pretty nicely I think.

And then as real large companies like McGraw-Hill, Pearson, Cisco, and K12.com started looking at Basic LTI – they found a few tweaks that are needed in the area of URL remapping and instructor access to custom parameters. Look for an updated Basic LTI 1.1 soon – if you are interested in seeing it early – you need to come to the working group as IMS does not publish specs until the Public Draft phase.

I also spent the last six months on the Sakai 2.9 portal and Chuck Hedrick’s Lesson Builder.

In an amazing sprint, Chuck built IMS CC 1.0 import, Blackboard import, and IMS CC 1.1 import into Lesson Builder. But Lesson Builder does not yet understand Basic LTI.

And so I find myself about 80 days away from the Sakai 2.9 code freeze; I take a look at our Basic LTI support compared to everyone else in the marketplace and realize that I have taken care of nearly every other LMS out there and left Sakai’s support for Basic LTI as nearly the clunkiest implementation in the market (but very elegant under the covers).

The advantage I have of having worked on/with every other LMS on the planet is that I now know what I like and know what the UI and requirements really are. I really feel that the ultimate UI that I want to build is a combination of the features in ATutor, Moodle, Blackboard and D2L.

So I have opened up a JIRA for the work:

https://jira.sakaiproject.org/browse/SAK-20774

And I have made a branch for the work.

https://source.sakaiproject.org/svn/basiclti/branches/SAK-20774/

For some strange reason I am a little nervous because I am building a bunch of new Sakai capability from scratch. I am sure I will get over my nervousness once code starts laying down and things start to make sense.

I am also nervous, I guess, because now I want Sakai’s Basic LTI support to be a completely awesome example of how to do it right – so it puts a little extra pressure on me. :)

Google Adopts User Interface Design from Sakai 2.9!

Google now has a black top bar with sans-serif white and grey text that looks like Sakai 2.9’s Neo Portal!

What is next? Rounded corners and a pool of water background for Google?

I would say that Gonzalo’s glowing blue for the selected item and our speedy-drop-down nav and expando-matic still puts Neo well ahead of Google in terms of UI goodness. I would say that it is good for Google to continue to aspire to UI greatness, using Sakai Neo as its roadmap.

(To be fair) Neo took its look and feel cues from Sakai OAE, and Sakai OAE ripped a lot of its look and feel from Twitter, I think – or perhaps Twitter took its latest look and feel from early OAE work. It is so confusing to keep track of who borrowed ideas from whom.

P.S. But seriously, it is nice to see a bit of convergence in these UI’s serving common purposes. It is all good for the users IMHO.