IMS Basic LTI Support in ATutor (Prototype)

I have been talking with Greg Gay of the ATutor (www.atutor.ca) project about adding IMS Basic LTI to ATutor. ATutor already has a very solid, certified implementation of IMS Common Cartridge 1.0, so I wanted to see them achieve Basic LTI certification as well and prepare them for IMS Common Cartridge 1.1, which includes Basic LTI.

I started working in earnest Sunday morning (coding helps me recover from jetlag) and finished it by Monday evening. Working in ATutor was quite nice – the module extensions are well documented and Greg and Cindy were quick with help any time I got stuck. The content system is less pluggable than the module system so my changes there are less elegant.

Here is a demonstration of the resulting work showing Wimba and the IMS Basic LTI test harness working in ATutor:

IMS Basic Learning Tools Interoperability in ATutor (Prototype) from Charles Severance on Vimeo.

It is always fun to experiment with and learn about a new architecture and approach to building an LMS, having now worked on Sakai, Moodle, and OLAT. I am particularly curious about how best to build PHP applications, so it is nice to look at how a mature project views plug-ins and how it implements Model-View-Controller.

The code consists of a module that primarily provides a new administrator capability called “Proxy Tools” to create instances of IMS Basic LTI tools. I took this design from the Moodle/Blackboard approach rather than the Sakai approach, but adapted it a bit to align more naturally with Full LTI when it comes out.
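For context, a Basic LTI launch is just an OAuth-signed form POST from the LMS to the tool's launch URL. The actual ATutor module is PHP, but here is a rough Python sketch of the signing step, using only the standard library; the tool URL, key, secret, and ids below are made-up placeholders, not values from the ATutor code:

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote


def sign_launch(url, params, consumer_key, consumer_secret):
    """Sign a Basic LTI launch POST with OAuth 1.0a HMAC-SHA1 body signing."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    allparams = {**params, **oauth}
    # Percent-encode every key and value, sort, and join into the base string
    pairs = sorted((quote(k, safe=""), quote(v, safe="")) for k, v in allparams.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base = "&".join(["POST", quote(url, safe=""), quote(param_str, safe="")])
    # Basic LTI has no token secret, so the key ends with a bare "&"
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    allparams["oauth_signature"] = base64.b64encode(digest).decode()
    return allparams  # POST these as form fields to the tool's launch URL


launch = sign_launch(
    "https://tool.example.com/launch",        # hypothetical tool URL
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "atutor-content-42",  # hypothetical resource id
     "user_id": "1234"},
    "my_key", "my_secret")
```

The tool provider recomputes the same signature from the posted fields and the shared secret; if the two match, the launch is trusted.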

The other portion of the code is an extension to the ATutor content system. The content system is not designed for extensions – so I have patches scattered across ATutor. I am sure that when Greg takes a look at it, he will have some good suggestions as to how it can be improved structure-wise. I was in a hurry so I made things work.

Getting the Software

Source Code to be checked out into /atutor/docs/mods/basiclti

There is a README file with instructions.

You first check the code out into the module directory (above) and then apply the included patches to the rest of the ATutor (trunk) distribution to make the changes to the content system.

After that, it is a matter of installing the module, creating some IMS Basic LTI Tools, adding content, and testing. The video (above) is a good test plan and shows what everything should look like when it is installed.

Next Steps

This is really only prototype code – written quickly. My next steps are to talk to Greg and the ATutor team to figure out how to do this properly in the ATutor code base in a way that is more maintainable.

It is always fun making prototypes – but then the hard work really begins.

ORM Experiment

I was creating some new tables and decided to build a simple ORM/form framework to automate the common tasks of displaying create/edit/view forms and generating INSERT/UPDATE SQL statements. I really like how this one turned out – it is the simplest ORM/FORM system I have ever built. It is here:

Dr. Chuck’s new FORM/ORM Experiment

I need to do a bit of refactoring to make the HTML generation more pluggable and to separate the portable, reusable bits from the ATutor-specific bits. But it was good fun to write; it saved me lots of repeated cutting and pasting to make forms, avoided making classes for objects, kept my re-typing of the same thing to a minimum, and made my code easier to debug and validate.
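The actual experiment is PHP inside ATutor, but the core idea – one field list driving both the SQL and the form, with no per-object classes – can be illustrated with a minimal Python sketch. The table and column names here are made up for the example:

```python
def insert_sql(table, fields):
    """Build a parameterized INSERT from a dict of column -> value."""
    cols = list(fields)
    placeholders = ", ".join(["?"] * len(cols))
    sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
    return sql, list(fields.values())


def update_sql(table, fields, key, keyval):
    """Build a parameterized UPDATE keyed on the given primary-key column."""
    assignments = ", ".join(f"{c} = ?" for c in fields)
    sql = f"UPDATE {table} SET {assignments} WHERE {key} = ?"
    return sql, list(fields.values()) + [keyval]


def edit_form(fields, labels=None):
    """Render a bare-bones HTML edit form from the same field dict."""
    labels = labels or {}
    rows = [f'<label>{labels.get(c, c)}: <input name="{c}" value="{v}"/></label>'
            for c, v in fields.items()]
    return "<form method='post'>" + "".join(rows) + "</form>"


# One description of the record drives the INSERT, the UPDATE, and the form
tool = {"name": "Wimba", "secret": "s3"}
sql, vals = insert_sql("basiclti_tools", tool)
```

Because the statements are parameterized, the values travel separately from the SQL and can be handed straight to the database layer.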

Jasig and Sakai Foundations To Pursue Merger

[This is simply the text of the joint announcement with no editing or commentary — Chuck]

Early this year, Jasig, the parent organization for uPortal, CAS, Bedework and other open source software serving higher education, and the Sakai Foundation, which supports the Sakai Collaboration and Learning Environment, formed Board-level groups to examine ways the two organizations could collaborate more closely. These Strategic Alliance Committees, led by Jasig Chair Aaron Godert, and Sakai Foundation Chair Josh Baron, met in New York in September to consolidate the outcomes of their discussions and bring proposals to their respective Boards.

Based on the recommendation of both Strategic Alliance Committees, the Jasig and Sakai Foundation Boards of Directors are today, October 7, 2010, jointly announcing an intention to pursue a merger of their two organizations, subject to the approval of their respective communities. The new entity would foster the development and use of open source software that supports the academic mission. This goal would be achieved through the identification and promotion of related best practices for increasing effectiveness, efficiency, and innovation in academic institutions, while lowering the risk of the development and adoption of open source software. It would support the further development of communities of interest and practice to explore the use of open source systems and tools to support teaching, learning, research, and other aspects of the academic enterprise. These communities of interest will strongly inform future software development effort.

“Through the discussions over the past several months it has become clear to me that Sakai and Jasig share, at a strategic as well as operational level, much in common”, said Josh Baron. “I am excited by the coming together of our organizations, which I see as a natural step in the evolution of higher education open source initiatives, as it will facilitate many new and exciting opportunities across the academy.”

The new organization would provide shared infrastructure, expertise, and other resources for the development of a wide range of open source software projects that are designed to meet the needs of academic institutions. It would conduct outreach and explore more meaningful relationships with a broad range of open source and standards organizations. It would provide a clearinghouse for best practices on related management issues such as crafting open source-friendly procurement processes and assessing the adoption risks of open source software.

The Jasig and Sakai Foundation Boards of Directors share the conviction that the values of openness, transparency, and meritocracy that underpin successful open source projects are profoundly congruent with academic values. We believe that, beyond the benefits that the software itself can provide, fostering the skills and experience necessary to manage the development and implementation itself will strengthen both the commitment and the effectiveness with which our respective institutions uphold these values across the entire range of academic endeavors.

“As we look to position our products and communities for continued and sustainable success into the future, the opportunity for Jasig and Sakai to join efforts and create a more robust network of open source innovation and community engagement, is one I am enthusiastic about,” said Aaron Godert.

Moving forward, the two Strategic Alliance committees will continue their due diligence regarding the mechanics and practicalities of a merger and will seek ongoing input from their communities. They will craft a detailed proposal to be approved by the respective Boards of Directors and then voted on by their respective communities in the coming months and will be regularly communicating as this work progresses.

About Sakai
The Sakai Community develops and distributes the open-source Sakai Collaborative Learning Environment, an enterprise-ready collaboration and courseware management platform that provides users with a suite of learning, portfolio, library and project tools. Sakai collaborators – ranging from educators to engineers – share in their successes and challenges, honing the community’s collective expertise to drive rapid development of this enterprise-ready platform. Sakai is distributed as free and open source software under the Educational Community License. Sakai is an open source software project driven by the Sakai Foundation, a worldwide consortium of institutions, organizations, and individuals dedicated to providing collaboration, research, and e-portfolio tools. The Sakai Foundation is a nonprofit organization that is dedicated to coordinating activities around Sakai and the Sakai community to insure Sakai’s long-term viability. For more information, please visit www.sakaiproject.org.

About Jasig
Jasig is a global consortium of educational institutions and commercial affiliates sponsoring free and open source software projects for higher education. Jasig is a member supported, non-profit corporation aiming to attract, advance, and sustain communities developing enterprise-level, open source software that helps institutions fulfill their goals. Jasig connects people, provides infrastructure, and sponsors events that foster innovation and collaboration. Jasig’s flagship projects include uPortal, an enterprise portal; CAS, the Central Authentication Service used for single sign-on and secure, proxied authentication; and Bedework, an enterprise calendar used for public events and personal and group calendaring. Jasig also manages a Project Incubator designed to mentor new open source projects and sponsors communities of practice, such as The 2-3-98 Project, which aims to help institutions understand how to exploit open source. For more information, visit the Jasig website at www.jasig.org.

Police Officer Makes Traffic Stop – Causes Five Cars to Crash – Chelsea, MI

I am publishing this account in hopes that police departments will think about ways to do traffic stops more safely without endangering other motorists.

What I am about to describe happened between 7:20 AM and 7:25 AM on eastbound I-94 near Chelsea, MI on September 10, 2010. I observed two accidents involving a total of five cars, caused by an officer who made a very bad decision about when and how to make his traffic stop.

I give two accounts of the incidents: the first is the timeline of my experience and observations as the events unfolded in front of me, and the second is a reconstruction of a coordinated timeline of the sequence of events as best I can piece it together. The incident occurred at the I-94 / US-12 exit, one exit east of the Kalmbach Road exit on I-94 near Chelsea, MI. Here is a map of the area of the accidents. The map below represents about a mile of I-94.

At 7:20 AM on September 10, the sun is perfectly lined up with eastbound I-94 and has just cleared the horizon, so it is fully visible and directly over the road as you drive eastbound. The day was perfectly clear, so driving was like looking into an arc welder. It is a pretty unsafe situation, but traffic was heavy and moving smoothly at right about the speed limit.

As I was coming around a curve at A on the map above, I was turning from shade directly into the sun. As I was emerging from the trees (see the tree pattern on the map), I saw a bunch of tail lights and excited lane changes, including some trucks moving into the left lane and cars scattering a bit. But everyone was slowing down and finding their place in the left lane safely. Once we all got into the left lane, we could see a motorcycle policeman had pulled over a semi truck right at the beginning of the exit to US-12 (B). All of the lane changes were folks following the law that requires us to give a stopped policeman a full lane. We had less than 500 yards because of the bend in the road, and because a truck in the right lane blocked the view of the stopped policeman, there was a bit of a scramble for us all to get into the left lane – but most drivers are pretty skilled at 7:30 and we did pretty well.

As we came upon the traffic stop, it appeared that the stop had just started; the officer looked like he had just gotten off the bike to walk to the front of the truck.

As we passed the police officer and stopped truck at (B), traffic started to go back into two lanes and gently sped back up – at this point the sun was directly in our eyes. In about 15 seconds, as I was passing over the bridge at (C), we again saw furious lane changing, brake lights, and cars darting onto the shoulder as traffic ground to a halt. I came to a complete stop about 100 yards past the bridge, right at (C). We had to brake firmly to stop in time, but there was no screeching of tires.

I was stopped for about 15 seconds, and the line of stopped cars behind me grew by a car every two seconds while I watched in my rear-view mirror. After there were about five cars stopped behind me, I noticed a small car moving quite fast and spinning on the road – the driver had clearly locked up the brakes and was spinning around like a top at 45 miles per hour. At some point it crashed into one of the stopped cars behind me. The crash bent some sheet metal, but it appeared no one was hurt.

After another 30 seconds, the traffic in front of me started to move. Three lanes of traffic (including the on-ramp from US-12) were squeezing down to one lane and driving on the shoulder to get past three wrecked cars at (D). It looked like the wreck was less than five minutes old, and from the debris pattern it looked like there had been contact in all three lanes, including the on-ramp. I took a picture of the furthest eastbound car as I went by at about 5 miles per hour on the right shoulder. I only took the picture after I had completely cleared the confusion of cars moving and merging.

After about two minutes we got past the last car and I was on my way to work, thankful that I was unhurt and unscratched, and knowing at least five people were having a bad day – especially the driver of the car in the median, which was quite banged up. Again, thankfully, it looked like there were no major injuries, but there were some quite damaged vehicles and a lot of freaked-out people.

Reconstruction

This is my best guess at a timeline for all the events that happened over a 5 minute period. These are educated guesses that fit my experience and the elapsed time as I experienced and perceived it.

07:18:00
I am 2 miles west of the US-12 exit, the police officer is pulling over the semi-truck.

07:19:00
The stop causes a lot of traffic confusion at (B) because of the combination of drivers going from shade into sun in their eyes and the fact that traffic was trying to get into the left lane with only about 500 yards of “visual warning,” since the stop was right after a fairly blind curve. Because there are trucks in the traffic, merging to one lane in 500 yards at 70 MPH is doubly difficult.

07:20:00
The leading edge of the confused traffic has reached (D). The problem of getting back into the correct lane is complicated by the intense sun, amplified because the road rises slightly at (D), making it appear as though you are driving directly into the sun, and by the fact that traffic is merging from US-12. The incoming traffic on the exit is also in the shade until it merges onto I-94 – at which point drivers are immediately staring into the sun (see the tree pattern on the map). People are trying to reaccelerate to normal speed, but the reduced visibility results in lots of confused brake hits. At some point this results in a three-car accident, likely triggered by a high-speed vehicle entering from the on-ramp, unaware that traffic is in a confused state and visibility is limited, merging into slower, confused traffic on the highway and causing a cascade of panic lane changes that wrecked three cars.

07:20:30
At this moment, I am at (A), encountering the back end of the confused traffic, which is unaware of the accident at (D) and tries to reaccelerate at (B). Within 30 seconds, the traffic stopped by the accident at (D) causes me to stop at (C).

07:21:00
There are five cars behind me and the two-car spinout accident happens behind me. This accident is almost inevitable as the column of stopped cars grows toward (B), where there is little forward visibility and the distraction of rushed lane changes triggered by the traffic stop. There may well have been several more accidents I was unaware of between (C) and (B) as the stopped column grew westward toward (A), where forward visibility was more reasonable.

07:23:00
Traffic starts to move, squeezing three lanes of traffic onto the far right shoulder.

07:24:00
I pass the cars at (D), take the picture of the easternmost car in the median with its radiator steaming, and then I am back on my way to work.

Conclusion

The simple, obvious conclusion is that a police officer stopped a truck during morning rush hour and within 5 minutes caused a three-car pileup and a two-car accident.

What went wrong, and how could this be prevented? First, avoid traffic stops in rush-hour traffic unless there is a very severe issue. Since most of the drivers on I-94 at 7:20 AM are likely there five days per week, traffic moves quickly, smoothly, and efficiently – but with all of the cars and trucks and skilled drivers, the traffic density is relatively high for the speeds.

Usually this works – but with little margin for error, little perturbations can cause problems easily.

Also, officers must take the sun into account for eastbound morning commutes. For the 30-minute period when the sun is directly in the eyes of drivers, avoid anything out of the ordinary. Sun directly in the eyes reduces a driver’s reaction time and awareness – if drivers are squinting or peering from under a sun visor, they are not fully aware of their surroundings.

In a sense, sun in the eyes is like an icy road. It is probably a good idea to avoid traffic stops on slippery roads during rush hour as a general principle – but police should be aware that sun is a very big effect when it is at the wrong angle.

Sadly, if there had been just a bit of cloud on the horizon, this would have been averted.

If I were one of the five drivers, I would attempt to sue the State Police for negligence. There is absolutely no question that the police officer was the proximate cause of all five accidents – whether that would translate to liability is another, more complex legal question.

In my mind it is less about punishing the officer – he had no intent to cause harm; he was just making a traffic stop and doing his job. The value of a lawsuit would be that future officers would think more carefully about the safety implications of traffic stops during rush-hour traffic.

It is likely that if he had made his traffic stop 30 minutes earlier or 30 minutes later, or if there had been clouds on the horizon (which there usually are), there would have been no incident. Police officer training should point out that direct sun into traffic at rush hour is as much a hazard as patches of ice on the road, and it should be part of the safety implications officers consider when deciding to make a traffic stop.

Remedial Math at LCC – What is the way forward?

I know very little about the internal structure and politics of the Lansing Community College (LCC) Math department, so I apologize in advance if I miss the boat in my analysis. I am not intending to criticize teachers or advisors, as I know how hard their jobs are and I know that it takes dedication to do their work.

But it does seem that the remedial program at LCC is very badly designed and managed, resulting in lost revenue to LCC and, far worse in my mind, lost souls in society. What the math department is currently doing is so harmful that someone needs to step in and clean things up. I suggest a simple fix in the short term and then some deeper analysis to fine-tune things.

One simple solution is to create a remedial course below MTH050 – perhaps it is not even on the books as a college course. Perhaps it is not eligible for financial aid, so you don’t run afoul of the federal government by using Pell grants to teach middle-school math. Perhaps it is not even 15 weeks long. If funds are an issue, let’s make a foundation and come up with a way for folks to give a charitable contribution to help those who cannot afford tuition get the remedial training that they need – I would write a check for $1,000 tomorrow morning to support such a program. Let’s write a grant and call it an experiment to get funding for a while.

By the way, one very clever idea in the boot camp program design was that one requirement to qualify for the boot camp was successful completion of some non-math LCC courses. That requirement is a great way to make sure students have demonstrated some basic study and maturity skills before they get to take this remedial class or apply for a scholarship to cover the remedial class tuition.

It would be perfectly fine to demand a certain reading level to take the remedial math class – because you have sufficient remedial reading courses. In some ways, it is good design for the path into remedial math to lead through reading for the weakest students.

My second choice as a solution is to simply let students into MTH050 regardless of ACCUPLACER score and let them take it repeatedly if need be. As I look at the material in MTH050, it seems very well structured with a reasonable pace. I do think some students would need to take it twice – weaker students would probably start to fall behind around midterm time – but that is OK. Perhaps you could allow “visitor non-credit” registration for MTH050 for marginal students, so your “fail rate” does not look bad when you look at the numbers. Marginal students would be given the option to attend MTH050 and could qualify for the course (perhaps even by the midterm) by doing well in the course material.

I would bet that before you used the ACCUPLACER, failure rates in MTH050 were higher than they are after ACCUPLACER was in place. Of course – ACCUPLACER simply filters out the students who need the course the most so you only teach the students who barely need MTH050. So failure rates go down. Yay! Kind of a hollow victory.

What you fail to understand when you focus on the pass/fail rate for a single semester as your only important metric is that you completely miss the fact that sometimes a lot of learning happens even when a student gets a failing grade. Are you interested in more and better learning, or are you interested in “better values for the numbers you happen to track”? For one of my previous rants about “numbers-obsessed high-school administrators” – click here.

A far more important measure of success is whether students who fail once eventually pass, and whether those students eventually find solid success in later math classes. Of the people I have informally talked to as they went through the LCC math program over the years, many took MTH050 more than once to build a solid base of math understanding that led to great success later in their college careers. But these people, whom you positively affected in an amazing way, would make your pass rate trend toward 50% because they needed to take the class twice – so your learning outcomes are great and your pass/fail numbers are not-so-good.

I bet it would not take much looking at all to find a bunch of people who failed MTH050 once and then took it again and passed and then went on to an amazing college career including advanced degrees. You could be satisfied in knowing that you really were part of an important transformative moment in those people’s lives.

And yes, students complain all the time that putting them in courses that they pay for and fail is a trick that colleges use to increase revenue.

So, all in all, there are many solutions and no one will be happy with any of them. Students and parents will always complain. So instead of looking for a way to eliminate student complaints by kicking the marginal students to the curb, look for a set of policies that produces the best possible learning outcomes for the broadest population of students.

Even though I despise the design approach of the ACCUPLACER test (separate post someday), it is not really the fault of the test. It is the fault of the LCC Math department policy makers who altered the test from a placement test, which you cannot fail, into an entrance exam, which can be failed.

Ultimately, you need to find a way to take the results of the placement test and use them to place students – not use them to reject students and give them no other options.

P.S. In talking with one of my higher-education pals about the previous post, he pointed out that I (and we as a society) are expecting community colleges to compensate for a failing high-school system that seems unable to teach math consistently. Furthermore, schools like Michigan State and the University of Michigan hide behind the ACT and SAT to make sure that the students they admit are filtered for success. In a sense, UM and MSU kick these marginal students to the curb even before they get on campus. At least LCC lets them in the building, lets them take the test, and lets them go to the math lab for help if they have the energy and drive to do so.

So is it unfair to be upset with LCC and Community Colleges in general when they do not solve these hard problems that seemingly no other educational institution in our country is capable of handling? Yes – of course it is unfair. But that is what is so wonderful about community colleges – they truly do take everyone and they give an amazing teaching value for the money spent – and they bend over backwards to bring students up to speed. Community colleges are the one place where a confused, poor, or marginally skilled student can come and clearly be cared for and be given a real chance for a future.

LCC already embraces the role of solving the hardest education problems in our society – and does an amazing job in every way that I have observed in all of my interactions with LCC over the past 35 years as a student, faculty member, spouse, and parent.

That is, for everything except math placement. It seems like something that is easy enough to fix – please find where or who the sticking point is and fix it. And when you come up with a solution, make sure to look closely at your own data, which suggests that students learn math best with real human contact with a teacher, and make sure your solution for the weakest math students includes classrooms and teachers instead of the online and self-paced approaches that your own data shows are a strategy for failure.

Thanks for listening.

Remedial Math a Roadblock to Education – Teaching Methods Matter

The Lansing State Journal recently ran an article titled “Math a roadblock for many in a quest to further education” that talked about how Lansing Community College (www.lcc.edu) is approaching Math Education – a topic near and dear to my heart. The article featured some interesting data on the relative success of teaching math in lecture format versus more self-paced options.

But the biggest theme in the article was how success in math is a necessary prerequisite to success in college – and many students get “so close,” with math the only remaining hurdle – and when those students fail at math, they simply drop out and quit.

One of my favorite quotes from the article is:

“Remedial math is emerging as the roadblock that prevents many students from earning degrees or transferring to a four-year school. The cost can be calculated, not just in tuition dollars, but in degrees left unfinished and careers that never begin.”

What the article glossed over is the use of the ACCUPLACER placement test at LCC and how it impacts student success. LCC uses the ACCUPLACER test to determine if you can get into their remedial math course (MTH 050). To get into remedial math, you must take a test, without benefit of a calculator, and demonstrate solid mastery of multiplication tables, finding the common denominator, skillfully converting mixed numbers to fractions, converting between percents and mixed numbers, and many other basic math skills. Let me say that again: without benefit of a calculator.

We are talking about 19- and 20-year-old kids who were supposed to learn multiplication tables in 4th grade for a few weeks and then spent the next eight years with a calculator in their hand for every math class they took – and yet those students who have long forgotten those 4th-grade skills (many of whom have a basic understanding of algebra, geometry, and even calculus) literally cannot take even the first math class at LCC until they go back and learn skills like long division by hand.

This suggests a really basic failure in the design of the ACCUPLACER exam and/or a failure on the part of LCC to provide remedial training in the skills needed to succeed. If multiplication-table memorization is critical to student success at LCC, then why not have a course in multiplication memorization? The LCC placement tests for Reading and Writing always place you in a course so you can start. Math is the only LCC placement test that you can “fail” and be washed out of.

As a personal example, my son Brent took the ACCUPLACER writing test (which is not even a writing test at all) and placed into the second-lowest writing class, in which he promptly got a 4.0; he then took the next writing class up, got a 4.0 in that as well, and this semester he will be taking college-level writing. I am actually perfectly happy he got this kind of remedial education, as it made absolutely sure he would be ready for college-level writing. When he took the ACCUPLACER reading exam, he placed into the lowest possible class (but there was a class) – he took the class and it was way below his skill level – but whatever – he took the course, got a 4.0 in it, and he is off and running into other courses. A little review never hurt.

But for math, ACCUPLACER says “you cannot go to college until you learn multiplication tables” – period. It is no wonder that many 19-year-olds walk out, say “f*ck that!”, and give up on college. They found their way to college, found their way through getting financial aid, got registered, arranged transportation, found the cafeteria – and yet they are told that “college is not for them” by a f*cking computer program. And a computer program for which there is no negotiation – the advisors will never override the program – the student simply needs to go home and spend the rest of their life as an unskilled worker working for low wages.

This seems so unfair. I wish pundits would get more pissed about this as a “fair access” / “social justice” issue.

The good news is that the Lansing State Journal article shows that the LCC math department (a) understands that there is a problem, (b) understands that failure in math is equivalent to failure in college, and (c) is looking at and measuring its own success and making changes.

Here is data from the article in the physical paper that did not make it into the online version of the article.

The simple summary of the data is that in all situations, the traditional lecture is the best way for kids to learn math at LCC. And the more remedial the level of the course, the more important the lecture format is to student success. If you look at the data a little more closely, you will see that human contact is a big factor in student success. The more students were left on their own to learn the material, the less successful they were in those courses.

Human contact and learning – wow – who would have guessed? Sage on the stage actually works – wow. The word “lecture” is not a dirty word? All the education experts who have never actually taught in a classroom – or perhaps people who claim to be experts but have never taught students with an SAT score less than 600 – piss me off so badly.

It turns out that this Lansing State Journal Story has a personal angle. Brent was actually in the class that was featured in the story so I have a bit more inside perspective.

Brent has been struggling to get into remedial math for well over a year now, and we were very pleased when he was contacted to be part of a one-week “boot camp” to help students succeed on the ACCUPLACER math test. To get into the boot camp, you had to have (a) failed the ACCUPLACER several times, and (b) successfully completed college-level courses in other areas. Brent had a 4.0 in his other classes and had done decently, but he had failed the ACCUPLACER three times in 12 months and had one shot left to try it before having to wait another 12 months to take the test again.

So we were overjoyed to be part of the boot camp. It ran from 8 AM to noon in the middle of summer, so we figured this would be pretty painful for a teenager – so we got him up and took him for the first few days. After Wednesday he was motivated enough to get up and go to LCC on his own.

The boot-camp teacher was Ms. Hardin who was featured in the article. For the first two days, I took Brent and would wait in the coffee shop with free WiFi but from time to time I would wander by the door of the classroom and peek in. I wanted to talk to Ms. Hardin and ask her some questions but Brent forbade any teacher-parent interaction.

From what I could observe, Ms. Hardin is a wonderful teacher. She loved math, she loved students, she loved teaching, and she was dedicated to student success. Her approach was classic, awesome hybrid lecture – she would introduce an idea at the front, give the students a few tips and tricks on how to approach a problem – sometimes a little memory aid (like “around the world” for mixed fractions) – and then she would pair them up and have them practice in teams. Brent’s teammate for several days was a retired soldier who had some good war stories for Brent about Iraq. Then Ms. Hardin would have different students come to the board and work problems – and then she would move on to a new topic.

Watching the body language of the students through the doorway – she had them all in the palm of her hand. The students knew she wanted them to succeed and that she cared about each and every one of them. The students knew that what they were trying to learn mattered – after all, depending on the outcome of the ACCUPLACER they would be taking on Friday, August 13, 2010, their lives would change forever. This was their last chance to get into math for 12 months, and a large fraction of those who failed the test Friday would probably drop out of college for the rest of their lives. For many this was literally their last chance at succeeding in college.

Each day Brent would come home happy with what he learned and confident. We would do a bit of homework and he knew all the tricks and tips from Ms. Hardin and they made sense to him. He wanted to try everything himself before I helped him. As we did the homework, his technique was always right – but he got the wrong answer on 1/3 of the problems because of a simple math error like thinking 21 was divisible by 8 instead of 7 (remember – no calculators allowed).

Friday morning was to be a one-hour review, and then the whole class would troop over and take the ACCUPLACER together while Ms. Hardin waited outside for the students to emerge so she could celebrate with them if they made it into MTH 050. Thursday night the whole family was excited at the possibility that tomorrow morning we would be “in” and Brent’s college career could really start in earnest. Everything felt good. Brent promised he would take lots of time on the test and double- and triple-check his hand-math. He was feeling really confident.

If this were a Disney movie, the story would have a happy ending at this point. But it does not. Brent got exactly the same score on the ACCUPLACER after a week of boot camp that he got two weeks earlier after home study – a 25. You need a 34 to get into MTH 050. As best Brent could tell, most of the students had similar results – most still failed (probably, like Brent, for the fourth consecutive time).

From what Brent said, Ms. Hardin seemed disappointed and sad at the results – I am sure that she, like the students, had high hopes that the boot camp would have a more positive outcome.

She promised the students who failed that she would try to see if she could get her newly-adopted kids (including the 35-year-old soldier back from the war in Iraq) into MTH 050. I am guessing that, using her instincts as a teacher, she had assessed the students and realized that they were good learners who could handle MTH 050 – particularly if they could have a calculator. Ms. Hardin wanted to open a new section of MTH 050 that she would teach and bring all the students from the boot camp into her section – she wanted to finish what she had started.

Given how much Brent enjoyed learning from Ms. Hardin, this seemed like the most wonderful of possible scenarios. Wow – not only getting into MTH 050 without having to pass the dreaded ACCUPLACER, but getting an absolutely wonderful teacher. After all the struggle we have had with this, the thought of going from failure and a 12-month wait to Brent taking MTH 050 from Ms. Hardin brings a bit of mist to my eyes as I write this paragraph.

We anxiously waited for word from Ms. Hardin on the MTH 050 decision – Brent checked his email several times per day and we asked him over and over. About a week later, the bad news arrived: Ms. Hardin could not create the MTH 050 section and she could not override her boot-camp students into the class. Again, this is not a Disney movie – this is real life.

So at some level, it might seem as though we are back at square one – staring at the unblinking, unfeeling, demonic ACCUPLACER test as the hard gateway to Brent getting a degree. We are going back to the drawing board: home study, multiplication table memorization, mixed fractions, and waiting 12 months until we can take the test again.

But there is a little hope – Brent has met Ms. Hardin and realized that math can be learned and that somewhere inside of LCC there are caring people who are really good at teaching math. So fighting our way in will be worth it. We are motivated to continue our efforts. We will see where it goes.

But the thing that tugs at me is that Brent is one of the lucky ones. We can afford to keep trying no matter what roadblocks we encounter. With a dad with a PhD and a mom with a 3.5 in Calculus, this will work out for Brent eventually. For many other students, Friday, August 13, 2010 has a good chance of being the day that they walk out of a college building never to walk back in for the rest of their lives.

In my next post, I will make suggestions to LCC as to how I think things should be improved.

Solving a Bug in a Dream – Sort Of

Last night I had a pretty detailed dream about a Sakai bug. In this dream, I stumbled across a bug in Sakai that was very simple and very obvious. It was not as much a code bug – but a bug in how we deployed something.

In a way, I was shocked that we had not already caught this simple mistake. Since it was so simple, I coded up a nice clean fix in the dream, tested the fix, and checked it into the trunk all before I woke up.

Then I woke up and immediately tried to remember the bug. But for the life of me, I could not remember what the bug was. For a while I wracked my brain – after all it was a simple bug with a very elegant fix.

After a cup of coffee and some more brain searching – it did occur to me that I had forgotten to add the feature to make it possible to place more than one instance of the Basic LTI tool in a site in Sakai 2.7.0.

So I made a JIRA for the Basic LTI multi-placement feature, coded the two-line change, tested it, committed the code, and closed the JIRA over a second cup of coffee.

http://jira.sakaiproject.org/browse/BLTI-69

But I still have no idea what the bug I fixed in the dream actually was.
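As background on what multi-placement plays into: a Basic LTI launch is just a browser form POST whose parameters are signed with OAuth 1.0 (HMAC-SHA1), and the resource_link_id parameter is what lets a tool tell two placements in the same site apart. Here is a minimal sketch of how such a launch is signed – the URL, consumer key, and secret are made-up example values, not anything from the Sakai code:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_basic_lti_launch(url, params, consumer_key, secret):
    """Sign a Basic LTI launch (an OAuth 1.0 form POST) with HMAC-SHA1."""
    oauth = dict(params)
    oauth.update({
        "oauth_consumer_key": consumer_key,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_version": "1.0",
        "oauth_timestamp": "1280000000",   # normally int(time.time())
        "oauth_nonce": "c1a2b3",           # normally a random value
    })
    # OAuth base string: method, percent-encoded URL, and the
    # percent-encoded, sorted parameter string, joined by "&"
    pairs = sorted((quote(k, safe=""), quote(v, safe="")) for k, v in oauth.items())
    base = "&".join([
        "POST",
        quote(url, safe=""),
        quote("&".join("%s=%s" % (k, v) for k, v in pairs), safe=""),
    ])
    key = quote(secret, safe="") + "&"  # no token secret in Basic LTI
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    oauth["oauth_signature"] = base64.b64encode(digest).decode()
    return oauth

launch = sign_basic_lti_launch(
    "http://tool.example.com/launch",
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "placement-2",  # distinct per placement in the site
    },
    "12345", "secret",
)
```

Because every placement carries its own resource_link_id, a tool can keep separate state for each one – which is exactly why allowing more than one placement of the Basic LTI tool per site matters.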

Book Review/Summary: DIY U by Anya Kamenetz

Anya Kamenetz was the keynote speaker at the Sakai conference in Denver in June and at the Blackboard Developer Conference in Orlando last week. I purchased her book (DIY U) at the Sakai conference and had her autograph it, planning to read it later when I had some time. After some Twitter interaction with Anya following the Blackboard keynote last week, I decided it was time to read the book and write a review.

Summary

DIY U is a great book. I have been working for so long in the engine rooms of higher education trying to improve technology for teaching and learning that I have not really been aware of the important changes in higher education in the last fifty years, and in particular in the last decade. When you are living it and living through it as a teacher and student, it is hard to see the high-level patterns that are going to change us going forward. Anya has done a masterful job of researching, explaining, and summarizing the history of transformation in higher education, the changing economic conditions of higher education, and some conventional and not-so-conventional possible evolutionary tracks for higher education.

Her writing style is efficient. Unlike many similar books, there is very little repetition just to pad pages. She tells us what we need to know in 135 pages making good use of her time and our time. Her writing style encourages critical thinking throughout – she will present several different points of view within the same paragraph, making sure to keep the reader’s focus on drawing their own conclusions from the information she presents.

The book chapters include (1) History, (2) Sociology, (3) Economics, (4) Computer Science, (5) Independent Study, and (6) Commencement. I will look at each of the chapters in turn.

History and Economics

Anya gives a nice summary of how higher education has evolved from early times to the present. I found her analysis of the post-war period particularly interesting, as this is the higher education that I experienced as a student and became part of as a staff member and later faculty. From my perspective experiencing it, there seemed to be very little change from the 1970’s to the present, but in reality there were a number of significant shifts in the federal and state funding and policy mix over the past 30 years.

Probably the largest factor that would lead one to believe that change might be imminent is the shift from state funding of public universities through general funds to federal subsidies for tuition through Pell grants and student loans. The continuously increasing federal subsidy for tuition has allowed states to drop their funding of (and their influence over) public universities, and significant federal funding has masked the pain of tuition increases as long as the federal government pours more money into the subsidies.

The problem as Anya points out is that these subsidies are justified as giving equal access to folks regardless of their economic and social standing. But it is also clear from the research that these funds (and matching financial aid from the universities) are far more likely to benefit the middle and upper class students than the poor and otherwise disenfranchised students. As this becomes more and more obvious, it may erode the political will behind these subsidies.

This is particularly scary for public universities, which have had carte blanche for tuition increases because of the historical gap between public university and private university tuition levels. At some point, public universities will no longer be able to roll out a 10% tuition increase that parents and students swallow because the alternative is much higher private tuition and federal subsidies cushion the blow.

The scariest moment may be triggered when public tuition gets within 20-30% of private tuition and the federal government decides to alter how subsidies are given, which means that all of a sudden public universities become unaffordable for middle-class students, perhaps in a relatively short period of time. Public universities (particularly smaller ones) may not have the endowment necessary to absorb the shock of such a change if it happens quickly.

Sociology

Anya gives us ample examples of why it is pretty challenging for higher education administrators to “do the right thing”. Most of the motivation arrows point in the wrong direction. As an example:

“… 25 percent of the US News and World Report Rankings come from peer reputation … [and] 75 percent of the other measures come from either direct or proxy measures of spending per student and exclusivity.”

This means that if a university were to find a way to improve the education of a student while reducing costs or admitting less-elite students, it might result in a drop in their all-important US News and World Report rankings. Another good example is hiring a faculty member with a lot of publications and awards and paying them to be on the masthead without ever putting them in a classroom – a “perk”. Anya describes situations where a school did some market analysis and decided the only way to improve its national image was to increase its tuition so folks would see it as somehow more exclusive. Yikes.

The motivation of traditional public and private universities to reduce enrollment plays directly into the hands of the for-profit universities that have found scalable approaches and are happy to increase enrollments and increase profits.

Another strong theme in the sociology chapter focuses on who gets admitted, who gets financial aid, and who graduates. While education is seen by society as an opportunity for all, and subsidizing education is generally a widely supported policy, there are some sticky bits when you look closely at the data.

“To put it bluntly, clever and/or middle-class children get more schooling than stupid and/or working-class children, and later earn more simply because they have had all the advantages in life, of which education is only one and not even the most important one.” — Christopher Jencks

The overall takeaway from this theme is that nearly all of the policy efforts to level the playing field are better-exploited by those who have less need.

Computer Science

In this chapter, Anya describes a series of case studies and reflects on work being done by innovators inside the higher education system. I like this chapter because in a book about edupunk and DIY-U, it is important to acknowledge the important internal efforts that are beginning to show a lot of promise and to move from emergent research towards the mainstream.

Since this is an area that I am working in, I think it is important to exercise a little caution as to the breadth of impact each of the mentioned projects really has in terms of real transformation. It is quite natural for a researcher (myself included) talking about their own project to overstate the breadth of application of their work. Of course the folks in these case studies feel that their work is transformative – but we do need to be a little circumspect and measure the transformative impact from outside the projects and over time.

Another interesting topic in the chapter that gave me a bit of pause is that the thin thread of funding for much of this advanced experimentation comes pretty much entirely from the William and Flora Hewlett Foundation and the Andrew W. Mellon Foundation. Anya points out that most of the funding to look boldly at new ways of thinking about education has come from one of these two foundations. What if the MIT OpenCourseWare effort had never been funded? Where would we be now? The exploration of these possible new approaches to education would have been set back many years if not for the investment of these foundations and their program officers such as Cathy Casserly, Ira Fuchs, Don Waters, and others.

Independent Study

In this chapter Anya talks about the Edupunk and DIY-U movements. Again it is a series of case studies that give a nice view of the different activity in this space.

My own personal feeling is that these are all excellent experiments with very little chance of scaling beyond the trivial but each gives us some insight into what is possible.

In a sense, I am inspired as I read this section and try to imagine the kind of technology that will support these new forms of education. These efforts are experimenting with technologies, content structures, interchange formats, cohort forming, portfolio building, assessment, credentialing, etc. As a software person, it feels like such a green-field space to move into – but at the same time it is really foggy as to what will work. It is kind of like the way we were all building our own learning management systems in the mid-1990’s and then a pattern emerged which became what we now call the Learning Management System (or LMS). What will be the new technology pattern to support this new teaching pattern? Like a vivid dream that you try to remember just after you wake up, I can almost but not quite visualize what this software could and should be.

Commencement

In this chapter, Anya summarizes and reflects on the entire book and does a great job putting it all in perspective. My favorite reflection is from page 132:

The Reformation didn’t destroy the Catholic Church, and the DIY educational revolution won’t eradicate verdant hillside colonial colleges, nor strip-mall trade schools. DIY U examples will multiply, though. Most likely in bits and pieces, fits and starts, traditional universities and colleges will be influenced by them and be more open and democratic, to better serve their communities and students. Along the way, we will encounter rough spots, growing pains, unintended and unforeseen consequences – but the alternative is to be satisfied with mediocrity, and insufficient supplies of it at that.

Conclusion

So that brings us to the end of our “roller-coaster” ride through the past, present, and possible future of higher education. Like all good roller coaster rides, it starts with a big hill to climb and a terrifying drop that makes you grab at your stomach and gets your heart racing. Then there are twists and turns and quick changes of direction, and at times we even find ourselves upside down, wondering if our iPhones will fall out of our pockets.

But at the end, we arrive back at the station safe and sound and no worse for the wear with our hearts beating faster and feeling more alive and most of all, wanting to get back in line and do it again as quickly as possible.

For me, personally, reading this book makes me think about the people who lead higher education administration in a whole new light. I realize that their jobs are not quite so boring as I imagined them to be, and that they are quite busy solving problems in a rapidly changing policy and funding environment.

New forms and patterns are emerging and will continue to emerge and those schools that get the new forms right out of the gate will have a leg up for decades.

Note: Favorite Passages

I just want to put down some of my favorite passages from Anya’s book. My copy is now dog-eared and highlighted, with many page corners turned over so I can skip to my favorite passages. I list my favorite paragraphs by page number and paragraph number. I count the first partial paragraph on a page as “paragraph 1”. Sometimes I list a range of paragraphs on a page or across multiple pages.

27-2, 33-3-5, 43-2, 47-1-3, 57-4, 61-3, 72-4, 73-2-5, 75-5, 86-2, 100-3 – 103-2, 103-5, 104-4, 105-1-2, 125-3, 127-5, 129-134

Dr. Chuck vs. Dr. Mark – Talking About the First Programming Course

Here is my latest entry into my discussion with Mark Guzdial of Georgia Tech about the philosophy and approach to the first programming course both in K12 and in Higher Education.

The best place to view my comments in context is in Mark’s Blog:

http://computinged.wordpress.com/2010/07/13/what-are-we-chopped-liver-cs-left-out-of-national-academy-stem-standards/#comment-3145

Here are Mark’s Comments

Charles, by what definition do you claim “Computer Science is focused on preparing CS professionals who will create technology”? Alan Perlis (one of the guys who coined the term “Computer Science”) argued in 1961 that all undergraduates should take CS, regardless of their major. Jeanette Wing argued in her Computational Thinking article that CS is a good degree to prepare a student for any career. Alan Kay’s “Triple Whammy” definition of CS doesn’t say anything about producing software. Our Threads CS degree, which has “software engineer” as only one of several possible outcomes, is being approved by ABET as a BS in CS degree program.

I’ve seen this definition (implicitly) on the SIGCSE members list, but have not figured out where it’s coming from. Is this a University of Michigan definition?

Here are my comments

There is not a “University of Michigan definition” – it is more the philosophy of the design of our undergraduate Informatics program. I am trying to give you some possible rationale for why your desire to introduce the notion of a computational model as a core part of a K12 curriculum seems to fall on deaf ears. It is pretty common for a focused domain to be so enamored with its core concepts that those in the domain feel that 100% of the educated people in our country must be exposed to those core concepts.

Both you and Alan have done a good job of reducing CS to a few easily described core concepts (storage, representation, processing). While you and (perhaps) Alan think that the elegant expression *makes* the case for inclusion of CS in the broadest of K12 curricula, I would claim that your descriptions make *exactly the opposite case*. Your descriptions make the case that the core CS concepts are not suitable for broad exposure in K12 nor as a single course required for all college students.

You seem to be stuck in the notion that if you had only fifteen weeks of material to present to a ninth grader or freshman, the best use of that time is to lay groundwork for understanding highly abstracted CS notions. You must realize that when you are designing such a curriculum, you must impart real knowledge that will truly be valuable to 100% of the educated population, assuming no further courses.

So as an example, the Water Cycle is really cool stuff – it serves as a great example to give students a window into science – and also gives them a great skill that helps them decide each day for the rest of their life whether to take an umbrella with them as they go to work or school.

Spreadsheets can be used to graph cool plant growth data and again offer a window into science – and being able to enter data and formulas into spreadsheets is also useful in lots of careers.

Spreadsheets and the Water Cycle are clearly of great use to the whole educated populace, and as such they are firmly ensconced in K12 curricula – and when there is a required technology course in higher education, it certainly includes spreadsheets.

Where you, Alan, and I certainly agree is that in this day and age, K12 curricula and broadly required college courses need to explore a much richer and deeper understanding of technology and the mechanisms that underlie technology. We all agree that this is rich and lovely material, very stimulating intellectually, and also highly useful throughout life.

Where we disagree is on the purpose of that first fifteen weeks – either in ninth grade or as that required-by-all college course.

Your position is that such a course is to be designed so that it is a wonderful prelude to Computer Science and inspires the student to pick CS as their chosen field, choose to go to college, choose CS as their major and spend 45 credits of their undergraduate degree in the required courses in one of the “threads”.

My position is “assume they never ever ever” take another technology course and I only have them for fifteen weeks and that they are paying real money for my course and I want them to come back years later and tell me that my course was one of the most useful courses they ever took in their whole life. (Hyperbole added to make the point).

Interestingly there is a lot of overlap between courses designed using the two different starting philosophies – both give some sense of data and computation and perhaps even networking – but when I build courses intended for a broad audience, I am trying to teach the lessons in computation as a side effect of giving students a useful and relevant life skill (i.e. like a spreadsheet). The courses designed from your perspective delay the “good stuff” and the “real-world application” because that has historically always come later in a CS curriculum (CS0/CS1 *are* the first in a series of Computer Science courses that build on one another).

Mark – you are on all the right committees and have the grants and credibility to begin a shift from “the first in a sequence of many CS courses” to a “literacy course that imparts useful life skills in computation”. I am not on those committees and not involved in those projects so my best chance for effecting the kind of change I would like to see happen is to convince *you* and then let you do the hard work :)

The best payoff for an effective and well-designed technology-literacy course is increased interest in Computer Science. At the end of such a course, while all the students have learned valuable life skills, some of the students may have gained a bit of curiosity about how it all really works. Those are the next generation of Computer Scientists.

So the irony, if my hypothesis is correct, is that we will increase overall interest in Computer Science if we teach less explicit CS and more useful technology skills in that all-important first broadly taken course at the K12 and college level. And such a course/curriculum approach would be far more palatable as part of a STEM approach for the next 10 years.

Blackboard Announces Plans to Deliver IMS Common Cartridge and Learning Tools Interoperability 1Q2011

During John Fontaine’s keynote at the Blackboard Developer Conference (BbDevCon), Ray Henderson announced that Blackboard will release support for IMS Common Cartridge and IMS Learning Tools Interoperability by 1Q2011 in their core product line.

John’s Blog: http://www.johnfontaine.com/
Ray’s Blog: http://www.rayhblog.com/blog/

I am pleased and excited because this is an important milestone in the progression of the market adoption of these standards that I am convinced will positively impact teaching and learning in ways we cannot begin to imagine. But in a sense I was not really surprised. Strong support for standards and interoperability is very much in Blackboard’s best interest and for me it always felt like it was only a question of when it would fit into the Blackboard development cycle.

If you think about it for a moment, Blackboard has a pretty diverse customer base due to the Angel and WebCT acquisitions, and they would very much like to get to the point where they have a single overall learning product with the best features of Blackboard, WebCT, and Angel. That unifying product will naturally be a future version of Blackboard, and one of the ways to get people to migrate to the latest version is to give them something in the latest version that they do not have in their current version.

I think that support for IMS Common Cartridge and LTI will be just the right kind of draw (among others) to bring customers forward and together in a positive way.

Beyond Blackboard’s customers, I hope that this is the beginning of Blackboard taking increasing leadership for the entire marketplace in terms of standards and interoperability. Even though Blackboard participated in both the working groups for Common Cartridge and Learning Tools Interoperability (Blackboard is co-chair of LTI), they were not the first to market for either standard. Now Ray has clearly made it a high priority to “catch up” and yesterday’s announcement was an indication that they will catch up pretty quickly.

I am imagining a future where Blackboard becomes increasingly open in what it is thinking about for next-generation approaches to teaching and learning.

While standards like IMS CC, IMS LTI, and IMS LIS are *very important* – they really are only the beginning of the kinds of standards we need to enable a true revolution in teaching and learning.

Too often we go through a dance where (a) vendors create multiple similar proprietary solutions, (b) we realize that this new space is important so we start a standards working group to produce some common subset of the solutions that is incompatible with any of the vendor solutions, and then (c) we try to “cat-herd” the vendors into adding support for the new standard that is not all that different from the feature they originally built.

This whole process can easily take a long time! If you look at IMS Tools Interoperability, where vendor solutions such as Building Blocks were coming out in the late 1990’s and the equivalent standard is just now making it into the marketplace, it has taken *over a decade*.

As a teacher and a student wanting to learn and teach in new and innovative ways, a decade is far too long to wait for a working, interoperable feature.

Going forward, we need to engage together and come up with one interoperable solution from the *very beginning*. But this means we need to approach new ideas in different ways – the members of the market need to stop looking for win-lose scenarios and stop thinking that “proprietary and closed” is the way to compete – and instead let the best products simply win, without building proprietary APIs, data formats, and integration patterns as the first step.

I am optimistic that this recent announcement is only the beginning of Blackboard’s engagement in standards – in particular, standards around innovative ways to use technology to teach and learn going forward. I am going to do my part to try to bring this new approach into the market – one where we work together earlier rather than later – one where we reduce the time-to-market for standards that enable innovation and increase the quality of those standards as well.

Like a sports team that is in a playoff, I will savor this important and necessary milestone for a day or so, and then it is back to work to figure out how to do this all better and faster. Thanks to Ray and the whole Blackboard team!