Category Archives: Uncategorized

What are the Key Challenges for the OER Movement?

The OLNet project (www.olnet.org) is assembling a list of the key challenges facing Open Educational Resources. They ask the question, “What are the Key Challenges for the OER Movement?” in their blog post at http://olnet.org/node/639

This is my response.

I think that the single largest challenge in OER is the complete lack of a standardized interchange format and free/open software to create, manage, edit, and remix open educational content.

Until we have rich and powerful editing tools in the hands of the people creating the content, we will never reach “escape velocity” in the OER space. We will continue to pay large sums of money to accumulate large repositories of expensively gathered/produced materials that are uneditable.

We need tools that allow teachers to manage their materials in a way that enhances their teaching as they teach and then, as a trivial side effect, produces high-quality, reusable, and editable content.

Once we accomplish this, so that a repository has a format for its artifacts and many repositories exist (like YouTube, uStream, Vimeo, etc.), then we have the ability to actually make OER part of the teaching cycle rather than something we do long after the teaching cycle is complete.

This is a challenging task and will require significant sustained effort. Four formats come to mind that might serve as an OER exchange format: (a) SCORM, (b) IMS Common Cartridge, (c) Connexions CNXML, and (d) PowerPoint.

I add PowerPoint because, of the four, it is the only one that does a really good job of remixing. While everyone ridicules PowerPoint, you must admit that it is nice to be able to send a complex, structured content collection to virtually anyone on the planet and know that they can read it, edit it, and convert it to a number of useful formats. It is not because PowerPoint is an amazingly elegant file format; it is simply because we all have software to read and edit the format on our computers.

Let’s briefly critique the other formats and their shortcomings.

SCORM – This format is aimed at instructional design, not teaching – it is great for training but not so good for teaching. The problem is less the file format and more the ecosystem of products aimed at a training-centered market that uses a write-once, run-without-modification pattern. Also, as more advanced features of SCORM are used, the resulting packages become less and less interoperable.

IMS Common Cartridge – IMS CC understands teaching and is designed to allow remixing, but since it is focused on maintaining interoperability of content, it is rather conservative in its scope. It is expanding its scope slowly, but it needs to speed up if it is to become a real OER format in which teachers would author their original courses. Also, there are no good free and open source tools that read and write IMS CC. Closed-source tools are better than nothing, but they leave innovation in the hands of the software owners.

Connexions CNXML – This is a good start as a format, but it is far too wrapped up in the specialized server code developed by Connexions. Also, CNXML is a little too focused on book-style resources and sequential ordering. CNXML would need to evolve to be able to represent a wider range of interoperable OER materials. I wish Connexions would be funded to start over and build a portable desktop application for authoring, and then put a much simpler server infrastructure behind it, rather than putting all the rich capabilities in their server software.

PowerPoint – rocks but its pedagogy is a bit limiting.

Software

What would the software look like? Frankly, the closest I have seen to the right software is SoftChalk (www.softchalk.com). It has a word-processor-like interface, uses its own internal format, and can export to SCORM, IMS CC, HTML, and many other formats.

The weakness of SoftChalk is that it is commercial and as such its requirements are driven by what sells. And OER is not a significant enough marketplace yet to cause commercial product roadmaps to bend in the right direction.

So I think we need to write our own.

Conclusion

Building such a format and software to support the format would take an amazing leap of faith. It would take millions of pounds/euros/dollars to jump-start such a project to the point that it worked well enough to build a self-sustaining community of users and developers who could maintain and innovate in the space.

Given that all the funding seems to be going to building another repository or analyzing usage logs, it is not likely this will ever get funded. But that is the nice thing about grand challenges – they stay grand.

P.S. The second biggest problem is the ability to link content together around learning objects like Khan Academy and enable highly dynamic learning paths through content with great tracking. Maybe the other great challenge is an open, extensible version of Khan Academy.

P.P.S. The third biggest challenge is to have a learning management system that allows for the dynamic formation of cohorts of learning around web content. Such a system needs to have a light touch UI-wise and be easy to figure out. My favorite example here is Edmodo – but again, closed, proprietary, and with an intent to make money off its captive customers. So again, we need to write our own open extensible Edmodo.

Educause: Openness Discussion Session

I will be participating in the Openness Discussion at Educause along with Patrick Masson, Ken Udas, and Luke Fernandez. Here are the details of the session:

Wednesday Oct 19th, 2011
3:30 PM – 4:20 PM
Meeting Room 102A/B
http://www.educause.edu/E2011/Program/DISC18

This discussion will focus on the emergence and adoption of principles and methods that can help develop and enable open communities of practice. Topics will include the characteristics, attributes, principles, and behaviors that promote open access; open-source software; open content; open educational resources; open courseware; open research; open standards; management practices like open governance; and more.

Patrick sent the following note to the CIO List about the session:

We are planning our Openness CG for the Educause conference and I was hoping to get a bit of input for the meeting. I have asked Luke Fernandez to “interview” Chuck Severance about his new book, “Sakai: Free as in Freedom.”

Luke has posted a very nice review of the book (http://itintheuniversity.blogspot.com/2011/08/review-of-sakai-free-as-in-freedom.html), as have others (Jim Farmer: http://mfeldstein.com/charles-severance%E2%80%99s-%E2%80%9Csakai-free-as-in-freedom%E2%80%9D-a-review/; Free Software Magazine: http://www.freesoftwaremagazine.com/columns/book_review_sakai_free_freedom_written_charles_severance#).

Chuck offers several thoughts about how Sakai, as both an open source project and a foundation, succeeded, and where he felt it might have misstepped. Many of these observations are specific to openness, and as Luke highlights from the text, “My opinion was that the purpose of the Foundation was to have a light touch and focus on nurturing the individual and organizational members of the community. The opposing view held by the majority of the board members was that the Foundation and Foundation staff were a form of command and control with the top of the authority hierarchy as the Sakai Foundation Board of Directors. The…stakeholders were concerned that letting individuals….make their own priority decisions….would be too risky for the adopting schools…..Central control and guidance was needed to insure that the product would move forward according to a well-defined and well-understood roadmap and do so on an agreed-to schedule.”

I think this touches on some of the key aspects of not only openness as a value proposition and operational paradigm, but also how organizations can be challenged to adopt, and thus take advantage of, openness. In the past, the Openness CGs have been poorly attended. I think this event could provide quite a draw, and I was wondering if those on the list here might be able to help promote our meeting?

I would appreciate it if folks could respond with suggestions, and volunteer to see those suggestions through for promoting the event. Also, if you have any topics of interest for Luke to cover with Chuck, it might help us with ensuring we touch on all of the areas of interest in the Openness CG. Hopefully we can spark a lively and informative discussion.

Thanks,
Patrick

Open Source Copyright Thoughts

A colleague asked me the following question:

For software produced here (UMichigan) that is going to be released under an open source license, who should be listed as the copyright holder in the source code? Would it be “The University of Michigan”?

Here is my answer:

I am not a lawyer but I do a lot of Open Source stuff in many projects. This is what I do and what people in Sakai generally do.

(1) If the project is primarily written by me from scratch, I use Apache 2.0 with the copyright in my own name. Anyone can reuse or include the code I build (including me) because the license requires only a simple acknowledgement – with the right license the “owner” does not matter. If someone wants to switch the copyright to a GPL license, the copyright owner can give that permission, and one person is easier to ask than a university. With a person the answer may be yes or no – with a university you will most likely never get any answer at all. Using your own name is not seen as arrogance unless someone asks if they can change the copyright and you say ‘no’. As an example, if I build a new experimental part of Sakai, I initially make it copyright me, Apache 2.0. If the code matures and ends up in the core of Sakai, I voluntarily change the license to Copyright the Sakai Foundation under the Educational Community License 2.0, as that is the preferred copyright for Sakai core source and it is considered tacky and arrogant to keep your name in the copyright of Sakai core code. I emphasize that it is not tacky to use your own name initially. It *is* considered tacky to refuse to allow the name to be changed when asked by a reasonable open source organization / project.

(2) If I am writing in the context of a project like Sakai, Moodle, or ATutor I simply use the copyright of the rest of the code with a clear indication of the extent of my authorship where appropriate. If you say anything about keeping your own name in the copyright in their code base – they will punt you as far as they can. They will kick you out of the project and talk about you behind your back for years to come.

(3) If I am writing a tool for an organization that I am doing consulting for, I insist on the code being copyright them, but Apache 2.0 – so I and others can reuse the code with no further permission except for an acknowledgement. I have not faced a situation where a client insists on me building significant code that is not open source – but frankly I would likely not even be willing to write code for such an organization. I do advise closed-source organizations on how they should go about writing code – I just won’t write their code.

(4) If I am building a tool where multiple UMich authors are involved, in particular when paid staff are involved, and none of (1), (2), or (3) applies, I copyright the code to the University of Michigan under Apache 2.0.
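For concreteness, a header following pattern (1) would look something like the following (the license text below is the standard Apache License 2.0 appendix boilerplate; the name and year are just illustrative):

```java
/*
 * Copyright (c) 2011 Charles Severance
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
```

Under pattern (4), the first line would instead read “Copyright (c) 2011 The University of Michigan”.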

Interestingly, w.r.t. Sakai, UM has a master agreement that lets them take Michigan contributions and change the copyright to the Sakai Foundation. Indiana and Stanford have similar master agreements. To my knowledge, MIT and Berkeley never executed such agreements, and so bits of code still have those universities named in copyright statements in the code. You will not see a Stanford, Indiana, or Michigan copyright anywhere in Sakai code – that is considered a badge of honor by the cool tribes. Interestingly, deep in one tool, we use some Stanford code for a concurrent hash map that was built well before the Sakai Project and used in Sakai. We did *not* request a change to the copyright in that utility code because it was clearly not produced as a contribution to the Sakai project.

All this is evidenced by the Sakai acknowledgements page:

http://nightly2.sakaiproject.org:8082/library/content/gateway/acknowledgments.html

The most awesome and giving schools are in the first list because they are the ones that contributed and transferred copyright or simply contributed with the foundation’s name as copyright in their contributions. They of course are in alphabetical order so as not to characterize any contribution as more important than any of the others.

Probably more than you wanted to know :)

/Chuck

Introducing FOORM – Form Oriented Object Relational Mapper

FOORM Presentation and Persistence

Note: This is extracted from one of my earlier blog posts. FOORM is still a work in progress and while I love using it and think it is the best presentation and persistence framework the world has ever seen – it is still early days and the framework maker must endure using it for a *long* time before accepting any new customers. So this is and will continue to be a work in progress for quite some time with the Sakai support for LTI as my test bed for FOORM.

In FOORM you build a model using an array of strings that looks as follows:

  static String[] CONTENT_MODEL = { 
      "id:key", 
      "tool_id:integer:hidden=true",
      "SITE_ID:text:maxlength=99:label=bl_content_site_id:role=admin",
      "title:text:label=bl_content_title:required=true:maxlength=255",
      "frameheight:integer:label=bl_frameheight",
      "newpage:checkbox:label=bl_newpage",
      "debug:checkbox:label=bl_debug",
      "custom:textarea:label=bl_custom:rows=5:cols=25:maxlength=1024",
      "launch:url:hidden=true:maxlength=255",
      "xmlimport:text:hidden=true:maxlength=16384", 
      "created_at:autodate",
      "updated_at:autodate" };

This is kind of like a Rails model in colon-separated form. One thing that FOORM assumes is that you will be internationalizing your UI, so it even encodes the field labels.
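To make the convention concrete, here is a sketch of my own (this is not FOORM’s actual parsing code, just an illustration) showing how one of these colon-separated strings breaks down into a field name, a type, and key=value attributes:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative parser for a FOORM-style model string such as
// "title:text:label=bl_content_title:required=true:maxlength=255".
public class FieldSpec {
    public final String name;
    public final String type;
    public final Map<String, String> attrs = new LinkedHashMap<String, String>();

    private FieldSpec(String name, String type) {
        this.name = name;
        this.type = type;
    }

    public static FieldSpec parse(String spec) {
        String[] parts = spec.split(":");
        // First segment is the field name, second is the type (e.g. text,
        // integer, checkbox, autodate); default the type if it is missing.
        FieldSpec f = new FieldSpec(parts[0], parts.length > 1 ? parts[1] : "text");
        for (int i = 2; i < parts.length; i++) {
            int eq = parts[i].indexOf('=');
            if (eq > 0) {
                f.attrs.put(parts[i].substring(0, eq), parts[i].substring(eq + 1));
            } else {
                f.attrs.put(parts[i], "true"); // bare attribute, e.g. "hidden"
            }
        }
        return f;
    }
}
```

So "title:text:label=bl_content_title:required=true:maxlength=255" yields name title, type text, and the label, required, and maxlength attributes.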

You can build an input form by calling the following method:

String formInput(Object previousData, String [] modelInfo, Object resourceLoader)

This returns a set of input tags, one for each field, repopulated from the previous data (which can be a Properties, Map&lt;String,Object&gt;, or ResultSet) and using the provided resource loader.

You embed the form tags in some Velocity as follows:

<form action="#toolForm("")" method="post" name="customizeForm" >
    $formInput
    <input type="hidden" name="sakai_csrf_token" value="$sakai_csrf_token" />  
   <input type="submit" accesskey ="s" class="active" name="$doAction" 
        value="$tlang.getString('gen.save')" />
   <input type="submit" accesskey ="x" name="$doCancel" 
        value="$tlang.getString('gen.cancel')" 
        onclick="location = '$sakai_ActionURL.setPanel("Content")';return false;">
</form>

There are utilities to validate incoming data and extract it into the right objects and types for inserting into a database, all with very few lines of code. Here is an example that extracts request properties into newMapping and inserts them into the database:

    HashMap<String, Object> newMapping = new HashMap<String, Object>();

    String errors = foorm.formExtract(requestProperties, formModel, 
        rb, true, newMapping, null);
    if (errors != null) return errors;

    String[] insertInfo = foorm.insertForm(newMapping);
    String makeSql = "INSERT INTO " + table + " ( " + insertInfo[0] + 
          " ) VALUES ( " + insertInfo[1] + " )";
    final Object[] fields = foorm.getInsertObjects(newMapping);
    m_sql.dbInsert(null, makeSql, fields);

It supports a very REST-style property bag approach to data models, except that all the fields are properly and fully realized as database columns, so they can be searched, selected, grouped, and ordered with as sophisticated a query as you want to create.
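For example (a sketch of mine using the table and column names from the CONTENT_MODEL above, not a FOORM API), ordinary SQL works directly against the realized columns:

```java
public class QuerySketch {
    // Because every model field is a real database column (unlike a
    // serialized property bag), plain WHERE / ORDER BY clauses just work.
    // Table and column names here mirror the CONTENT_MODEL shown earlier.
    public static String siteContentQuery(String table) {
        return "SELECT id, title, updated_at FROM " + table +
               " WHERE SITE_ID = ? ORDER BY updated_at DESC";
    }
}
```

The resulting string can then be handed to whatever JDBC or SqlService call you normally use.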

FOORM is designed to be extended so you have the generic Foorm and you extend it to make a SakaiFoorm that captures all the Sakai-specific rules for generating fields, handling internationalization, etc.

I even dispensed with the “CREATE TABLE” scripts – since Foorm knows the entire model, it can make the tables and sequences automatically. It knows about the various rules for fields in MySQL, HSQL, and Oracle. The AutoDDL code looks as follows:

 String[] sqls = foorm.formSqlTable("lti_content", 
     LTIService.CONTENT_MODEL, m_sql.getVendor(), doReset);
 for (String sql : sqls)
   if (m_sql.dbWriteFailQuiet(null, sql, null)) M_log.info(sql);

I could make this more succinct with an overridable method in Foorm. Foorm still needs to learn about changes to data models and automatically generate “ALTER TABLE” commands. I will write that code when a future version of DBLTIService is released and I need to expand the model.

I think that Foorm has a lot of potential. It is kind of a weird combination of a partial presentation layer and partial object relational mapper. It is focused on providing useful library utilities that let the developer get as tricky as they want, making sure that the nasty repetitive common tasks take as little effort as possible.

You can ask me “Why not Hibernate?” or “Why not Spring JDBC?” at a bar sometime. Be prepared to hear my voice rise a few notches as I rail that “Hibernate is an overbearing, over-engineered steaming pile of…(oops, did I say that out loud?)” or “Spring JDBC makes SQL portable in Java only to the level that they solve the trivial problems and punt on anything mildly interesting or complex…”. If I have had a few beers before the conversation starts, it will be more (or less) interesting.

Oh by the way, Foorm provides a nice, generic way to handle paged SQL queries. I still need to build advanced WHERE clauses, and support ORDER BY and GROUP BY operations.
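As a sketch of what vendor-aware paging involves (my illustration, not Foorm’s actual code), MySQL and HSQL can append LIMIT/OFFSET, while Oracle traditionally wraps the query using the ROWNUM pseudocolumn:

```java
public class PagingSketch {
    // Wraps an arbitrary SELECT so it returns `count` rows starting at
    // offset `first`, using the vendor's native paging idiom.
    public static String page(String sql, String vendor, int first, int count) {
        if ("oracle".equals(vendor)) {
            // Classic pre-12c Oracle paging: filter on ROWNUM in two layers.
            return "SELECT * FROM ( SELECT q.*, ROWNUM rnum FROM ( " + sql +
                   " ) q WHERE ROWNUM <= " + (first + count) +
                   " ) WHERE rnum > " + first;
        }
        // MySQL and HSQL both accept LIMIT/OFFSET.
        return sql + " LIMIT " + count + " OFFSET " + first;
    }
}
```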

This is why Foorm is still a prototype and I don’t want anyone else using it for a while. I feel that any library / framework should be used by its creator for a long time, solving lots of real-world problems before they claim to have anything truly reusable. But I figured I would show folks what I am thinking.

I have been experimenting with finding the right combination of power and convenience for both display layers and persistence. Before FOORM there was OMG-ORM (http://omg-software.com/) – it too was an exploration of the balance between powerful capabilities and intrusiveness as well as trying to make things intuitive. FOORM is much simpler than OMG and I find it more natural to use.

Dave Johnson: A Simple Explanation of Open Licenses (Updated)

I ran across this classic from 2006 and just had to make sure to remember it.

Dave Johnson humorously and insightfully reduces open source licenses to this over-simplified form by creating three classes of licenses: (a) Gimme Credit, (b) Gimme Patches, and (c) Gimme it ALL!

I cannot say it better than Dave Johnson does in this classic 2006 blog post:

http://rollerweblogger.org/roller/entry/gimme_credit_gimme_fixes_gimme

Read the blog comments. The second one in particular refines the over-simplified notion of the GPL, making it clear that “Gimme it ALL!” only matters when you redistribute.

And actually, with the new Affero GPL, the list should be:

  • Gimme Credit! (Apache, BSD, MIT)
  • Gimme Patches! (MPL, CDDL, LGPL)
  • Gimme it ALL! if you redistribute. (GPL)
  • Gimme it ALL! (Affero GPL)

That completes Dave’s list nicely and brings it up to date.

Abstract: Disruptive Innovation (Keynote)

Keynote Speaker: Charles Severance
Wednesday September 28, 2011
European Sakai Conference: http://www.eurosakai.nl

The Sakai effort intended to take the reins of innovation in teaching and learning technology back from the commercial vendors in the marketplace. The success of the Sakai CLE to date is largely because it effectively reflects the dual nature of teaching and collaboration in higher education. Because the Sakai CLE serves these two roles, it has taken some time to get to the point where the CLE has feature parity with the other Learning Management Systems in the marketplace. With the Sakai 2.9 CLE release, we need to think about what it means to be capable of direct competition with the other vendors in the LMS marketplace. At the same time as our CLE product is maturing, we need to find a way to move our community closer to the “innovation front”. The Sakai OAE is building wonderful new academic collaboration capabilities with a user- and content-centered focus. Yet another innovation front is building the teaching and learning technology for self-organizing learning activities like Khan Academy, as well as providing technology that is a better fit for K12 education. As the Sakai CLE reaches maturity in the marketplace, we need to look at ways to consolidate our gains as well as pursue new areas where the Sakai community can positively influence teaching and learning with technology around the world.

Abstract: Sakai 2.9, 2.10 and Especially the Portal Innovations

Speaker: Charles Severance
Tuesday September 27, 2011
European Sakai Conference: http://www.eurosakai.nl

The Sakai 2.9 CLE portal has a whole new look and feel focused on a more efficient and engaging user experience in Sakai. This session will present new features of the Sakai CLE 2.9 portal, including the new navigation, site preferences, and chat/instant message system. The session will look at both the user experience and the technical underpinnings of how the new portal works. We will discuss the remaining CLE 2.9 challenges and issues to be worked on over the summer and through to the code freeze. We will also discuss possible areas for investment for Sakai 2.10.

Management Versus Leadership

I was collecting thoughts on the notion of the difference between management and leadership in a file on my desktop. I want to clean up my desktop so I will store them here for now.

Leadership appears when it is most needed. Management is inveterate.

Leadership benefits those who are led. Management benefits itself.

Leadership has been and will be part of the human experience for all time. Management is an invention of the industrial revolution.

Leadership is an attribute of a person. Management is an attribute of a position.

A Colonoscopy Story

I am generally a pretty healthy guy and so I generally don’t spend much time in doctors’ offices for myself. I get a physical every 10 years whether I need it or not, and the results are (thankfully) the same.

A few weeks back, it was time for my first physical (WHERE AGE > 50) (sorry for slipping into SQL there). My doctor went through all the normal things and everything was fine and nothing had changed since my last physical in 2001. That was all great news.

But then we had “the conversation” about how people over 50 need a colonoscopy. Given that I am in pretty good health and have no family history of cancer of any kind, let alone colon cancer, I might have had a decent excuse to say “no thanks” and walk away.

I knew this conversation was going to happen and so in the days before, I thought about my good friend and dear colleague Bob Frost.

Memorial Facebook Page
http://www.dr-chuck.com/csev-blog/2011/04/bob-frost-memorial-service/
http://www.dr-chuck.com/csev-blog/2011/03/bob-frost-1952-2011/

As much as I wanted to run away and say “no”, I knew the right answer was “yes”.

So yesterday was my date for the Colonoscopy procedure at the University of Michigan Hospital. I had known other folks who had the procedure at UM Hospital so that made me extra comfortable.

The preparation started Thursday morning. Most folks say the prep is the worst part. This is the prep that we chose:

http://www.med.umich.edu/1libr/aha/ummiralaxprep.htm

I followed the prep almost perfectly. Wednesday night, I had a substantial meal with steak, soup, salad, and a baked potato at Buddies in Holt at about 10PM. I wanted my body to have lots of pre-digested protein, fat, and carbohydrates in the bloodstream to endure the 1.5-day fast. My last solid food was an Egg McMuffin at 8AM on Thursday.

I had an IMS LTI teleconference at noon to talk about RDF and XML bindings when the prep was supposed to start. I foolishly waited until the last minute to buy my pills and MiraLax and Gatorade. Then the meeting started to go long. So I switched to my cell phone and drove to the pharmacy to pick up my materials. There were some detailed discussions going on as I picked up my supplies. I must have sounded pretty weird to folks in the other aisles at CVS when I started addressing the availability of RDF libraries for PHP. To be courteous, I dropped off the call to check out and then rushed back home to rejoin the call. I took the pills at 1PM (one hour late) in my car on the way home to get the procedure started. I got home in time to rejoin and finish the meeting from my home office. It was a great meeting and a lot was accomplished – I was glad I did not miss it.

At about 3PM I started drinking the MiraLax/Gatorade mix. The procedure said to wait until “something happened” or 6PM – but nothing “happened” and I did not want to fall behind schedule so I started chugging the 9X daily dose of MiraLax between 3PM and 5PM. By 6PM, still nothing was “happening”. I was starting to feel like a failure. But by 6:30PM, I was relieved that things started to “happen”. After that, they happened about every 30 minutes pretty much like clockwork.

For me, there was never any bloating, cramping, or discomfort. About every 30 minutes I would begin to “feel the need” and – would take care of business, and then go lay down and watch TV. I was not tired or uncomfortable at any point. But I was distracted enough to screw up and miss an IEEE Computer Editorial board teleconference at 4PM (Sorry Ron and Judith). I could have made the meeting – I just forgot because I was too focused at 4PM and worried that nothing was happening and I was falling behind schedule – and I hate to fall behind schedule.

By 9PM things had been progressing very comfortably and regularly every 30 minutes. I had to make a run to pick up some football tickets for a Friday night game from my friend John Liskey. He was at a bar about 20 minutes away from my house. So I waited until “something happened” and then rushed to the bar. There was construction on the route I took so it took longer than I had anticipated (uh-oh). At 9PM on a Thursday they were resurfacing Cedar Street and, in a pipelined fashion, they were grinding off a lane, cleaning it, laying down new asphalt, and then finishing the asphalt – all in an amazing dance that took about 500 yards of machinery. It was beautiful engineering – but my mind was elsewhere as the traffic crawled along. I got through traffic and made it to the bar before I was completely uncomfortable – I rushed in, waved at John, and then ran to the restroom. I stayed at the bar for 30 minutes, and when it happened again, I knew I had 30 minutes to get home, so I quickly said my good-byes and took a much quicker route back home.

By 10:30 things were slowing down to more like once per hour. So I decided to try to get some sleep. I slept for a while, then needed to get up, and then went back to sleep. It was OK – kind of like a west-coast redeye flight. I got enough sleep to trick my body into not being tired the next day. At 4AM, I started the next 9 doses of MiraLax plus Gatorade and pretty quickly we were back on the every-30-minutes schedule. By 5AM I decided to stop trying to go back to sleep and started to catch up on e-mail in between sittings.

I was a little concerned with the “output”. Throughout the night, there were less and less solids and at about 6AM there were no solids at all. But the color was brownish and still not fully clear as specified in the procedure. Again I started to feel like I was behind schedule and might end up failing.

My “last allowed liquid” time was 9AM so I savored a cup of good black coffee (Loving my Bunn Velocity Brew Coffee maker) with a bit of sugar – but no cream. I did not want to end up with a caffeine headache during the day. By 9AM, the mean time between sittings was creeping back up to 45 minutes and the “output” was increasingly light, but it was not “clear and yellow” per the prep specifications. I was quite worried. At 9AM I felt confident enough to do some lifting and hauling and did an errand that took me about 45 minutes – no problem at all – by then the pace was slowing.

At 10 AM it was time to take the 1 hour drive to Ann Arbor and Michigan Hospital. So off we went. About 45 minutes into the drive, I started to get “the feeling” and would have been most happy if we stopped. But since we were running late, stopping was not in the cards. It was not really uncomfortable but I was really happy when we arrived at the hospital. And doubly happy that the hospital has restrooms as soon as you walk out of the parking ramp and into the hospital.

And the awesome news was that the “output” was perfect! It met all the prep specifications and then some. I was so happy – it was so pretty. I should have taken a picture and tweeted it! All the stuff I had read suggested a lemonade-color. But this was more of a pretty deep yellow color – more like egg-drop soup without the eggs. To me, it was so pretty because the last thing I wanted to do was disappoint my doctors and I certainly did not want to have to come back and try again another week. School starts Tuesday and it is hard to take two days off while teaching.

So we rushed to the Medical Procedures Unit five minutes late and checked in. I had been vague-tweeting because I did not want folks to get concerned. When they called my name, I made my last tweet, gave my phone to Teresa, and went in to get my gown and lie down on my side.

Inside the unit, I changed into my gown and went to the bathroom one more time – and yes it was that awesome clear yellow color. I had achieved the specifications of the prep only 60 minutes before my appointment – but hey – it was on time and on budget – so I was very pleased.

There was a really nice clinician who wanted to ask me some survey questions for a study they were doing. When she got to the privacy part, she said, “The worst case that might happen is that if one of our servers were hacked, someone might figure out that you had a Colonoscopy.” I told her I was not too worried about privacy since I had tweeted that I was having one about 15 minutes earlier. It was a cool survey and I told her I wished I had the “as-built” data to inform my prep. She laughed. I told her that during the prep, I really wanted to do it right and I spent a lot of time on Google trying to find anything that might give me some kind of summary of what should happen when as you do the colonoscopy preparation. They were great telling me what to do – but not so good telling me what to expect and how the prep would progress. I also said that their timing was perfect – although I might have wished it finished 2 hours earlier. If I do this next time, I might start the process at 11AM. But of course, because of the IMS LTI meeting I did not start until 1PM – so I *was* an hour late starting – so the schedule slip was my fault (Mark, Greg, Lance, and Colin contributed to the schedule slip unknowingly).

Once I finished the survey, it was time for the IV – this was the scariest part for me. I have a weird relationship with needles. I can handle the pain, but somehow I am scared of them (except of course when I am getting a tattoo). One time, many years ago, I fainted after I stood up from having blood drawn. So since then, I tell anyone about to put a needle into me that I am a wuss. I think that telling them I am a little scared makes it OK for me, because since I started telling them I was scared I have had no problems. But my veins are too small and it took her four tries to get me locked and loaded. Gaak. It was not painful, but more nerve-wracking than anything else. But the clinician kept asking me questions about the prep and it distracted me nicely. Now that I think about it – maybe she was not doing a survey at all – she was just there to distract me – Hmmm. Whatever. It worked. And we had the IV.

The nurse asked me all the stock questions and that went fine. Then we went over all the risks and that was fine. Then we got to the part where I was going to sign it all and I had a little concern. I felt uncomfortable signing a document that said they were going to look in and remove whatever polyps they found so they could do a biopsy. The one other person I knew who had this done – they found one polyp and removed it.

What concerned me was that I was signing a “blank check”. For me, taking a quick look and removing 1-2 polyps for diagnostic purposes was OK and understandable. But what if they found more? How many would they take out? It seemed to me that at some point, when you go from 1-2 removals to 10+ removals (which would be very rare), we are switching from a low-risk diagnostic procedure to a small major surgery. And frankly, I wanted to understand what was going on if there was to be a major surgery – I might want a second opinion or something, and I wanted to be emotionally prepared if there was something major. What made me most uncomfortable was the sense that they would put me out, and then when I woke up they would tell me whether I had had a routine diagnostic procedure or (what seemed to me to be) a small major surgery. I was not ready to let that decision be made while I was under anesthesia. That was my decision to make, and I wanted to be awake and presented with the evidence so I could participate in that choice.

This was the first time they joked and mentioned the phrase “Doctors make the worst patients.” I went through the story of my concerns four times. First I explained it to the clinician doing the survey and she told me horror stories of colon cancer. I then told her the story of Bob Frost and told her I was properly respectful of colon cancer but wanted to be part of any major surgery decisions. The clinician was nice about it. Then the receiving nurse came in and we had the same conversation. She just said, let’s sign the consent form as is and put a sticky note on it. I agreed.

Then the nurse supervisor came in and we did the story again. She was somewhat stern but understood. I am sure that the problem is that if they need to bring the doctor out to talk to me, it messes up their timing a lot. I understand efficiency concerns but I was not going to agree to something I did not understand. So I lay there for what must have been 45 minutes, all prepped and ready to go, telling this story to folks over and over again. I watched others being wheeled in ahead of me whilst I waited (all the while with no laptop or iPhone to pass the time).

Since I was insisting on discussing the procedure – I spent more time waiting. Lying in the hospital bed looking out the window with nothing to do, I spent a lot of time thinking about Brent, his surgeries, and the fact that he is working through some tough issues.

Finally the doctor came in. She was awesome. I explained my concerns about being comfortable with removing a polyp or two for a biopsy and not comfortable with declaring “open season” if there were a small army of polyps when they got in there. She assured me that if there were more than 1-2 polyps that she would stop at 2 – so I agreed and we tore up the sticky note and were good to go.

In a few more minutes they wheeled me into the room with the nurses and the doctor and I rolled over on my left side. Everyone was joking, light-hearted and laughing. They were amazed that I scheduled a colonoscopy on my birthday. I told them it was my birthday present to myself. I told them that if it was my fiftieth birthday it would show how much I wanted the colonoscopy. But instead I told them that it took four years before my physician “caught up with me”. I also told them that I needed to get this done before classes started at University of Michigan on Tuesday September 6.

They put the drugs in my IV and in about 15 seconds, I said – “Things are a little out of focus” – and then 5 seconds later I gently went to sleep. The next thing I know I am back awake with no discomfort at all – I mean zero – no discomfort. I am looking at a computer monitor showing video from the camera. I don’t know if it was video being replayed or if it was live – but I was seeing them pulling the camera slowly back out of my intestine. It was so awesome – I wish I could have had the whole video – you could see the folds and chambers of the inside of my colon – it was the coolest thing.

It turned out that I had one polyp (completely normal) and they took it out to examine (completely normal). They wheeled me into recovery where Teresa came in. They gave me some pictures of the polyp and the test results. I got my iPhone back and tweeted post-procedure with a picture of my scary IV that was still in. In about 20 minutes I was ready to get dressed and leave. There was no lingering pain, no bloating – absolutely no abnormal feeling at all. It was literally as if nothing had happened.

So we left (I of course was not driving) and went to have lunch at the Caribbean Jerk Pit. Since I had just had a Colonoscopy I backed off and only had medium spiciness. I figured I would take it easy. Because of the timing of the steak meal and Egg McMuffin, I never experienced any hunger. I am a bit weird because I skip meals all the time and don’t feel hungry – so your mileage may vary on the hunger thing.

On the way back to Lansing we stopped at the Secretary of State office to renew my plates (it is my birthday, remember) since I had waited until the last minute for that as well. Then we went to Culvers for some custard.

Then we went home and changed and went to the football game at 7:30PM. I would point out that after I went to the bathroom about 11:30 AM right before I went into the procedure, I had no more urges to go to the bathroom at all. The timing and dosage of the MiraLax cocktail was perfect.

I next went to the bathroom normally on Saturday morning (20 hours after the colonoscopy) and everything was normal – no pain, no bloating, no urge, no nothing.

All in all, it was an amazing experience and I am nothing but impressed with the medical professionals at the UM Hospital who do this procedure. And it is an awesomely good feeling to know that my colon is as healthy as the rest of me. And I certainly have Bob Frost to thank for making sure I had the backbone to actually do this and not delay it.

Obviously if you are reading this while considering your own procedure, all situations are different. Other than being a bit chubby and sitting in front of a laptop computer too much, I am in excellent health and this was a *diagnostic* procedure as part of a physical exam. If you have health issues or are on medication you may have a quite different experience. I guess what I would suggest is not to be worried about a Colonoscopy procedure at all if you are in good health.

I hope that if you have read this far, (a) you are not too grossed out, and (b) you have found this helpful.

Comments and suggestions welcome.

The Funniest Wikipedia Article I Have Ever Seen: Outcomes Based Education

Perhaps my sense of humor is completely warped. Perhaps I am just a mean and hateful person looking for the negative in everything. But I found the following Wikipedia article totally funny. It is really the first example I have seen where Wikipedia contains content that, while appearing to be factual, has a cleverly encoded contrarian perspective hidden inside.

http://en.wikipedia.org/wiki/Outcome-based_education

As best I can tell, this is a classic Mark Antony, “I come to bury Caesar, not to praise him” speech. While appearing to be factual and neutral, the underlying theme of this writing is that Outcomes Based Education taken to excess is a mistake. But the “this idea is overhyped” message is cleverly wrapped in highfalutin passive-voice academic writing to the point that it nearly masks the intent of the writer.

I think that many of the sentences in these paragraphs are dripping with sarcasm. The real message is just barely visible under the surface of the prose. I copy the verbatim text from Wikipedia here on the off chance that someone will edit the article to remove the delightful, deadpan sarcasm.

Outcome-based education (OBE) is a recurring education reform model. It is a student-centered learning philosophy that focuses on empirically measuring student performance, which are called outcomes. OBE contrasts with traditional education, which primarily focuses on the resources that are available to the student, which are called inputs. While OBE implementations often incorporate a host of many progressive pedagogical models and ideas, such as reform mathematics, block scheduling, project-based learning and whole language reading, OBE in itself does not specify or require any particular style of teaching or learning. Instead, it requires that students demonstrate that they have learned the required skills and content. However in practice, OBE generally promotes curricula and assessment based on constructivist methods and discourages traditional education approaches based on direct instruction of facts and standard methods. Though it is claimed the focus is not on “inputs”, OBE generally is used to justify increased funding requirements, increased graduation and testing requirements, and additional preparation, homework, and continuing education time spent by students, parents and teachers in supporting learning.

Each independent education agency specifies its own outcomes and its own methods of measuring student achievement according to those outcomes. The results of these measurements can be used for different purposes. For example, one agency may use the information to determine how well the overall education system is performing, and another may use its assessments to determine whether an individual student has learned required material.

Outcome-based methods have been adopted for large numbers of students in several countries. In the United States, the Texas Assessment of Academic Skills started in 1991. In Australia, implementation of OBE in Western Australia was widely criticised by parents and teachers and was mostly dropped in January 2007. In South Africa, OBE was dropped in mid 2010. OBE was also used on a large scale in Hong Kong. On a smaller scale, some OBE practices, such as not passing a student who does not know the required material, have been used by individual teachers around the world for centuries.

OBE was a popular term in the United States during the 1980s and early 1990s. It is also called mastery education, performance-based education, and other names.

Yeah, what might be some of those other names that might be used to describe OBE?

Yes, I know – I am just like those old guys on the Muppet Show – always finding something to be grumpy about :).

P.S. I am not anti-Outcomes Based Education – I think good teaching draws from a lot of techniques. There is no single silver bullet that “solves teaching”. All too often experts pick one technique and then run around like Mario with his hammer in Donkey Kong, just banging the same hammer over and over everywhere they go. Real teaching is very dynamic and requires a good teacher to use the right technique at the right time.