Showing posts with label LAK11. Show all posts

15 February 2011

#LAK11 - A MOOC's End

The LAK11 course is not quite over yet - the last week even looks busy and interesting, with various lectures on the future of learning analytics - but why not make a short round-up already?

A MOOC (or OOC, since MOOC apparently sounds similar to the Catalan word for mucus) is a massive, open and online course. This earlier blog post outlines the characteristics of the genre.



1. The pluses

a. The participation of experts from a wide range of domains guaranteed interesting forum posts and a range of contacts and resources outside the regular course materials.  Learning learning analytics alongside the learning analysts, in true connectivist tradition.  These "expert participants" formulated critical questions and comments, and thus helped avoid the "groupthink" pitfall in online courses.

b. The course facilitators were excellent in moderating the discussions.  They managed to make the guest lectures and the round-ups interesting for both experts and non-experts.

c. The materials were high quality, with a lot of variation in content and media, plus a series of software tools.  The materials were complemented by suggestions from participants.  (The materials will stay online and available after the course.)

d. The open character of the course gives you the freedom to contribute how and when you want.  If you have a lot of time, you can do all the readings and create a summary.  If you're too busy, you can limit yourself to reading a summary from a fellow learner.  The course facilitators also send out a daily e-mail with a short round-up and a few links.

e. The online character is perhaps an evident point, but I keep finding it amazing that you can participate from Phnom Penh in a course on learning analytics, organized from Canada with speakers from Canada, the US and Europe.

Courtesy Zigazou76


2. The minuses

a. How can such a MOOC be economically sustainable?  Preparing and facilitating the course takes a lot of time, and nobody pays a course fee.  Offering accreditation on demand (and for payment) could be an option, but I don't think this was the case for this course.  Perhaps improved positioning of the organizing university could bring economic benefits?

b. Forum activity decreased sharply after a few weeks.  I'm not sure (since this was my first MOOC) whether this is a recurring trend or due to the more specialized course content during those weeks.  To complete a MOOC, however, intrinsic motivation needs to be strong (and remain strong).

The list of minuses is much shorter than the list of pluses, which indicates (correctly) that I found this MOOC a very interesting experience.  As a tool for continuous professional development (CPD) it's excellent. I'm looking forward to other ones...

#LAK11 - Week 5 - Organizational Implementation of Learning and Knowledge Analytics

A concept at the forefront this week was action analytics.  It refers to the application of academic or learning analytics with a clear action component in mind.  Most universities collect a lot of institutional stats, but struggle to provide more than reports.  In an earlier blog post I briefly described the Signals Project at Purdue University.

This project provides an "early warning system" for students at risk of dropping out.  It combines socio-economic data, educational data mining techniques and (increasingly) social network analysis.

The background for this trend is an increasing demand for higher education, combined with a drive towards greater cost efficiency and quality.  Action analytics can, among other things, help increase the retention rate (early drop-outs are costly), improve admissions and provide a more personalized learning experience (for example through the use of recommender systems, cf. Amazon).  Action analytics are currently mostly found in for-profit educational institutions.

In her LAK11 talk, Linda Baer from the Gates Foundation describes the use of action analytics (coupled with measurable targets) as central to the organization's functioning.  Analytics is not limited to the research department, but requires buy-in from all departments (and the leadership) within an institution.  Furthermore, she argues that the use of analytics is not limited to within the institution, and that the availability of cross-institutional performance data will grow dramatically.  A driving force is the growing success of "open architecture" approaches, replacing earlier, closed "data silos". For example, Education Watch Online! provides "state by state" summaries of achievement for the USA and best-practice examples of schools (Norris et al., 2008).

Finally, the skills, competences and habits of mind needed to be employable in today's flattening world cannot be assessed by today's measures of grades and resumes.  The use of e-portfolios will rise.  Individuals will be responsible for maintaining a lifelong personal learning record in order to demonstrate their capabilities (Oblinger, 2007).

The 2011 Horizon Report (shortlist) predicts an adoption time for learning analytics of 4-5 years.

Here is a link to the recording of the guest lecture by Linda Baer.

30 January 2011

LAK11 - Week 3: The Semantic Web: the Web as Database?

The keyword during week 3 of the LAK11 course is the Semantic Web (or, a bit less informatively, web 3.0). A range of text materials, from the very accessible to the more technical, were out there to help us grasp these sprawling concepts.

Tim Berners-Lee, the father of the World Wide Web, recalls in his TED talk (highly recommended, btw) that he wrote his 1989 proposal to set up a linked information system out of frustration with his work as a software engineer at CERN.  Confronted with all kinds of different data formats, information systems and isolated information, he wrote the proposal and the code for the web.  Today he experiences a similar kind of frustration: the frustration of not finding what he's looking for.  From this frustration, he advocates the creation of a semantic web on top of the current web.

                                          Source: The Economist

Read further here (new blog address)

22 January 2011

#LAK11 - Week 2 - The Rise of Big Data in Education

Learning analytics is frequently hailed as the ultimate arrival of a data-driven paradigm in education.  The successful application of randomized trials in medicine, for example, could be transplanted to education.  In a randomized trial, a sample is randomly divided into a group that gets the treatment and a control group that doesn't (but gets a placebo).  Analysis of variance then reveals whether the treatment works.  Ian Ayres, a law professor at Yale Law School, shows in his book Super Crunchers that randomized trials already have wide appeal outside medicine.  eHarmony, a matchmaking agency, uses large data sets and regression analysis to analyze which combinations of personal traits match well in order to present people with a better match.  Internet firms such as Amazon and Pandora provide entertainment advice based on existing tastes or purchases.  IBM's "Smarter Planet" programme provides plenty of examples of using analytics in city planning or environmental management.


Drivers are bigger and better data sets, since data are more frequently collected in real time by machines, making them more reliable than data collected by questionnaires.  Collecting and storing them is also becoming cheaper.  In education, randomized trials can analyze the effect of certain learning resources, like a video or an animation, on student learning.  Changes in curriculum structure could be analyzed quantitatively.
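To make this concrete, here is a minimal sketch of how such a trial could be analyzed with a permutation test (a simple alternative to analysis of variance), using entirely hypothetical quiz scores for students who did or did not watch an animation:

```python
import random
import statistics

def permutation_test(treatment, control, n_permutations=10_000, seed=42):
    """Estimate a p-value for the observed difference in mean scores
    by repeatedly reshuffling the group labels."""
    observed = statistics.mean(treatment) - statistics.mean(control)
    pooled = treatment + control
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        diff = (statistics.mean(pooled[:len(treatment)])
                - statistics.mean(pooled[len(treatment):]))
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_permutations

# Hypothetical quiz scores (illustrative numbers only)
with_video = [72, 85, 78, 90, 66, 81, 88, 75]
without_video = [70, 64, 72, 68, 77, 60, 71, 65]
diff, p = permutation_test(with_video, without_video)
```

A small p-value would suggest the difference in means is unlikely to be due to the random assignment alone; with real course data the sample sizes would of course be much larger.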

Ayres points out that randomized trials and regression often do a better job than experts.  Quantitative analysis of Supreme Court verdicts proved a better predictor than the judgments of experts.  A regression model with three variables, developed by Orley Ashenfelter, an economics professor at Princeton, for predicting the quality of Bordeaux wines outperformed wine experts.

The main reason is that experts can't quantify the role of variables.  Of course, experts also know that winter rainfall and temperature affect the quality of wine, but they can't accurately assign weights to their influence.  The more complex the situation, the better the model performs and the worse experts are at predicting.
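"Assigning weights" is exactly what least-squares regression does. A minimal single-predictor sketch, with made-up growing-season temperatures and quality scores (Ashenfelter's actual model used several variables and real auction prices):

```python
def fit_line(xs, ys):
    """Ordinary least squares for one predictor: y = a + b*x.
    The slope b is the 'weight' the data assign to the variable x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical data: average growing-season temperature vs. a quality index
temps = [15.0, 15.8, 16.5, 17.2, 18.0]
quality = [78, 81, 84, 86, 90]
a, b = fit_line(temps, quality)
```

The fitted slope quantifies how many quality points an extra degree of warmth is worth in this toy data, which is precisely the weighting that experts struggle to do in their heads.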

For all its successes, though, statistical analysis continues to face tremendous skepticism and even animosity. For one thing, Ayres notes, statistics threaten the “informational monopoly” of experts in various fields. But even to many people without a vested interest, relying on cold, hard numbers rather than human instinct seems soulless.

Learning Analytics raises other critiques as well.  Privacy is a major issue.  Do learners have the right to access the data that are gathered about them, or the right to refuse to have data collected about them?  For example, students could reasonably be skeptical towards data being collected about their off-task behavior. Another issue is that there might be a tendency to take only those elements into account that can be measured.  This is called the McNamara fallacy, and in its original form says:

The first step is to measure whatever can be easily measured. This is ok as far as it goes. The second step is to disregard that which can't be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can't be measured easily really isn't important. This is blindness. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide.

Some elements in education are difficult to measure, like whether deeper learning takes place, how learners are engaged with the materials or what the longer term effects are of learning interventions.
Predictive models will also do a worse job when the field or subject is evolving quickly.  In Cambodia, for example, characteristics of student populations are changing rapidly: students are becoming richer and more technologically savvy and literate.  Predictive models in this context would need to be updated very frequently.

Other critiques formulated on the Moodle forum, and summarized by George Siemens, seemed more emotionally inspired, and I found them less well grounded.  Learning analytics doesn't mean that all complexity is reduced to numbers, nor that ambiguity is no longer accepted.  Ayres pointed out that models not only predict, but also indicate the precision of their predictions.  If the phenomenon is very difficult to predict, the correlation will say so.  This counters another critique, that models don't take the uniqueness of humans into account.  Models and correlations can be wrongly interpreted, and uncertainty in the data can be ignored, but this is hardly a problem specific to learning analytics alone.  It is a misconception that expressing something in numbers automatically conceals uncertainty.

This is not to say that statistics cannot be misused, intentionally or unintentionally.  Viplav Baxi provides an excellent overview on the LAK11 forum:

At the very basic level, there are many arguments for or against statistical analyses and other forms of analytics (such as those generated by "intelligent" systems). The arguments address generalizability (do the analytics imply that we can take general actions and predict outcomes), appropriateness (are the analytics appropriate to generate for the domain under consideration), accuracy (did we have enough information, did we choose the right sample), interpretation (can we rely on automated analytics or do we need manual intervention or both), bias (analytics used to support an underlying set of beliefs), method (were the methods and assumptions correct), predictive power (can the analytics give us sufficient predictive power) and substantiation (are the analytics supported by other empirical evidences). 

An interesting quote came from Chris Lott, who fears that learning analytics will become the new buzzword on the block and may spawn another "cottage industry of repetitive pundits".  Indeed, who will interpret all the data gathered during an online course?  Teachers, tutors, administration, external companies or… the learners themselves?  Tony Hirst expresses the hope that learning analytics will be used creatively to give learners more control over their learning.

To me, two weeks of reading about learning analytics have offered a tantalizing glimpse of its potential, without forgetting some concerns.  Or, to end with a quote from Bill Fitzgerald: "I would hope we could outgrow our pursuit of silver bullets".

15 January 2011

#LAK11 Learning Analytics - Week 1

The first week of the LAK11 course was spent exploring the field of learning analytics.  Its relations with domains such as business intelligence and web analytics were outlined (Elias, 2011), as was its close relationship with academic analytics and educational data mining (EDM).

Learning analytics could be defined as the use of educational data to improve learning.  Course facilitator George Siemens (TEKRI, Athabasca University) puts it this way:

"Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs"


Online learning produces a wealth of student data, such as time spent on certain modules, number of logins, number of posts on forums and social interactions with other students and tutors.  Academic institutions can use these data to improve learning.  For example, they can provide an "early warning system" and predict which students are likely to fail their exams.  These students could then get extra support (Macfadyen & Dawson, 2010).  By coupling learning data with socio-economic data and demographics, predictions about student success can be made.  Purdue University's Signals Project is a flagship case in this regard.
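In its simplest form, such an early warning system is just a scoring rule over a few engagement signals. A toy sketch (the thresholds, weights and traffic-light labels here are my own invented illustration, not how Signals actually works):

```python
def risk_flag(logins_per_week, avg_quiz_score, forum_posts):
    """Return a traffic-light risk level from a few crude engagement signals.
    All thresholds are hypothetical; real systems are fitted to historical data."""
    score = 0
    if logins_per_week < 2:     # rarely logs in
        score += 2
    if avg_quiz_score < 60:     # failing the formative quizzes
        score += 2
    if forum_posts == 0:        # no social interaction at all
        score += 1
    if score >= 3:
        return "red"
    return "yellow" if score >= 1 else "green"

# A disengaged student trips all three signals
flag = risk_flag(logins_per_week=1, avg_quiz_score=55, forum_posts=0)
```

A production system would replace the hand-picked thresholds with a statistical model trained on past cohorts, but the output - a flag that triggers extra support - is the same idea.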


Recommender systems, such as those used by e-commerce firms like Amazon, are regularly mentioned as one of the potential applications of learning analytics.  For example, based on the sources accessed or links in the social network, students could get recommendations about potentially interesting articles, blogs or people.

Looking for meaningful patterns in large sets of educational data is called educational data mining.  Learning analytics, however, includes using these data to intervene in the learning process, for example by altering the course content, the provision of support or the use of tools.
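As a sketch of the recommender idea, here is a tiny co-occurrence recommender over resource-access logs: students whose reading histories overlap with yours vote for the resources you haven't seen yet. The students and resources are, of course, made up:

```python
from collections import Counter

# Hypothetical access logs: student -> set of resources opened
logs = {
    "ann":   {"video1", "paper2", "blog3"},
    "ben":   {"video1", "paper2"},
    "carla": {"video1", "blog3", "paper4"},
}

def recommend(student, logs, k=2):
    """Suggest up to k unseen resources, weighted by how much each
    other student's history overlaps with this student's."""
    seen = logs[student]
    scores = Counter()
    for other, items in logs.items():
        if other == student:
            continue
        overlap = len(seen & items)
        for item in items - seen:
            scores[item] += overlap
    return [item for item, _ in scores.most_common(k)]

suggestions = recommend("ben", logs)
```

Real systems (Amazon-style collaborative filtering) use similarity measures and much larger matrices, but this captures the core logic: people with similar histories are a signal for what you might want next.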

 Siemens, 2010

Learning analytics could (and should) also be student-centered.  This means that students could be granted access to course data; for instance, they could see how much time they've spent on various course activities and compare it with their peers.  The question of what students want generated discussion on the course forums.  The idea, outlined by John Fritz in his presentation, is that students take more responsibility for their own learning and strengthen their meta-cognitive abilities.  They could get access to the data, but it would be their responsibility to interpret it and act upon it.  However, most institutions are still in the phase of collecting heaps of data and analyzing them, without really predicting and modeling behavior, or using it to optimize learning.
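The peer-comparison part of such a student dashboard is computationally trivial; a sketch with invented time-on-task numbers:

```python
def percentile_of(value, peers):
    """Percentage of peer values strictly below the student's own value."""
    below = sum(1 for p in peers if p < value)
    return 100 * below / len(peers)

# Hypothetical minutes spent on a module by five anonymized peers
peer_minutes = [20, 40, 60, 80, 100]
my_standing = percentile_of(70, peer_minutes)  # "you spent more time than X% of peers"
```

The interesting design questions are not in the arithmetic but in the presentation: whether seeing "you are below the median" motivates a student or discourages them, and whether peers should be identifiable.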


Institutions can't absolve students of "at least partial responsibility for their own education. To do so denies both the right of the individual to refuse education and the right of the institution to be selective in its judgments as to who should be further educated. More importantly, it runs counter to the essential notion that effective education requires that individuals take responsibility for their own learning" (p. 144).
Vincent Tinto, Leaving College: Rethinking the causes and cures of student attrition (1993)

An interesting discussion focused on the issue of distributed analytics.  It is easier to collect data when all student activity is concentrated in one platform or Learning Management System (LMS).  However, when students are encouraged to use a variety of tools, sources and interaction platforms, gathering meaningful data becomes more difficult.  A second issue is privacy, in particular when other students also get access to the data.

My first impressions of the MOOC: overwhelming, chaotic and high quality.  The amount of e-mails and forum posts is staggering, and different discussions take place simultaneously.  However, the purpose of a MOOC is not really to participate in everything but rather to be selective.  In that way, a MOOC is a great way to get lectures, information and feedback from some of the leading researchers in the field.  You are stimulated to read the materials and try to make sense of them at your own pace and on your own knowledge level.  A next step is then to create something (like a forum post), share it and get into contact with "likeminded souls".  We'll see how that plays out.

12 January 2011

What a MOOC is like

A MOOC is a Massive (in various degrees of massiveness), Open and Online Course.  One MOOC has started this week on Learning Analytics & Knowledge (LAK11).  Another one, on Connectivism & Connective Knowledge (CCK11) is starting next week.  MOOCs are offered in various domains from education to ICT to biology.  MOOCs are definitely on the rise.  

MOOCs, what are they and where do they come from?

In 2008, George Siemens and Stephen Downes taught a course on learning theory at the University of Manitoba. Rather than limit access to the lectures to the 25 students registered for the course, they allowed the general public to attend virtually. The result was that more than 2300 people participated in the course.

First, they are massive.  They tend to attract hundreds or, for some courses, even more than a thousand participants, although some may participate only passively or drop out before the end of the course.

Second, they are open.  This means that they are free, that there are no entry requirements, that there is no formal trajectory that needs to be followed and that all activity is voluntary.  There is also no accreditation, apart from the appreciation of fellow learners, although taking a course for credit is sometimes offered optionally for a fee. The courses are very participatory, without fixed assignments, but with an invitation to engage in discussions and build networks.

Finally, they are online.  All activity takes place online, usually through a combination of synchronous activity (online lectures and discussions using software platforms such as Elluminate) and asynchronous activity (blog posts, forums, e-mail newsletters, twitter messages, status updates).  Software like Moodle and tools like Netvibes help keep track of all the activity going on.

Below is a short video from Dave Cormier on the essence of a MOOC.
Are they successful?  I'm trying it out, and keeping you posted.
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.