Getting to know about Business Intelligence


Business Intelligence (BI): Evidence-based decision-making and the processes that gather, present, and use that evidence base. It can extend from providing evidence to support potential students’ decisions about whether to apply for a course, through evidence to support individual faculty/department and staff members, teams and departments, to evidence to support strategic decisions for the whole institution. (JISC, 2011)

I’m in Birmingham today learning about what JISC has been doing in the world of Business Intelligence (BI). The reason is that we may bring this area of work under the umbrella of Emerging Practices to amplify the outputs, lessons learned and experiences from the range of projects taking part. Projects include:

Introduction to the day

The day started off with an introduction from Steve Bailey (Senior Adviser, JISC infoNet) and Myles Danson (Programme Manager, JISC), the purpose of the day being to:

  • Reflect on the successes of the programme
  • Explore what challenges remain and explore strategies for dealing with them
  • Consider how JISC/other bodies might help to progress this work

The original call for projects went out in September 2010. Funding of £500,000 was available for a number of projects at up to £50,000 each, to help senior managers and decision makers make better use of both internal and external data in support of institutional management and decision making.

Projects worked in conjunction with the Business Intelligence infoKit, using the BI maturity model to gauge their progress along the way. The maturity model is made up of the following six levels:

  1. Traditional information sources: fragmented and mistrusted
  2. Coherent information: centrally reliable, locally responsible
  3. BI Project: selecting an approach and a vendor
  4. Initial BI system
  5. Growing BI coverage and involvement
  6. Reliable predictions and forecasting

Steve mentioned that the infoKit will be updated with findings from the BI projects and asked projects to make sure they share any ideas on how the infoKit could be improved. Project case studies are due out soon. Steve also mentioned that he is working with the OCU on a white book for BI covering Higher Education (HE) across Europe. The maturity model has been adopted by the OCU, which increases our confidence in the material originally developed for this programme. Adam Cooper (JISC CETIS) also mentioned that Educause are developing a business analytics paper, due out in a couple of weeks, which will be worth keeping an eye on. Other resources highlighted during the opening session included:

Positives/Negatives/Change

Projects then discussed the positive and negative aspects of their BI projects before discussing what they might have changed if they were to do it again. Points raised during this session are listed below, although this list isn’t comprehensive as I struggled to get everything down.

Positives

  • Enhanced benchmarking capability
  • Breaking out of organisational silos to surface more meaningful connections
  • All relevant data gathered in one application
  • Reputational enhancement
  • Surfaced data quality issues which can then be resolved
  • Awareness of system
  • Process improvements: data contacts; data owners
  • Realistic assessment of maturity
  • Better understanding of needs
  • Senior management buy-in
  • Inspired other institutions
  • Selected appropriate reporting tool
  • ‘Agile’ approach with external developers worked well
  • Software visualisation applauded
  • Software developed
  • Involvement of visualisation ‘experts’ helping with the project and wider BI across the uni
  • External validation by peers
  • Have fallback strategies
  • Rapid development can deliver backup targets when required
  • Been able to rapidly develop BI apps
  • Improved communication across organisational units
  • Involvement of a wide project team, from a range of different sections of the uni
  • Better evidence for business case and our management decision

Negatives

  • Data quality issues
  • Data not fit for purpose: duplication; definitions
  • Shifting organisational sands
  • Wrong ‘customer’
  • Senior management buy-in
  • Lacking skills to interpret and analyse the rich data available
  • Challenge of defining data eg research themes
  • Unrealistic expectations
  • It’s my data—unwillingness to share
  • The right data?
  • Business case needs more work
  • Over ambitious targets
  • Changes during the project meant the data specification kept changing
  • Find new ways of engaging staff
  • Long projects are difficult in rapidly changing environments
  • Lack of buy in and change of project director
  • Expectations
  • Skills gap in analytics, and expertise dispersed across the structure
  • Found analytics competencies across HE restricted
  • Software development better suited to short burst projects
  • Timeliness at beginning of project
  • Balancing internal demands for BI vs project

Changes

  • Get relevant stakeholders together at project start
  • More innovative and creative and flexible data visualisation techniques
  • Data quality and audit tools to identify and resolve data issues
  • Integrate data owners more closely with data usage and drivers
  • Make BI more ‘sellable’ to senior management
  • Get right project sponsor
  • Service design techniques (specify what first, how after)
  • Grooming senior champion
  • Organisational development for cultural change

3 Key Changes

Projects were then asked to vote on what they thought were the three most important changes, highlighted in the previous exercise, that might benefit the sector. Each group within the room was asked to discuss one of the highlighted changes. The following is a summary of what they reported back to the room.

Sector Wide Benchmarking Tool

The UEL team provided a quick demonstration of QlikView (I think this is the right version). Using HESA’s HEIDI data set they were able to quickly benchmark their institution against others. As an example they showed how their staff salaries compared with those of the University of Glasgow, and then with Russell Group institutions. The speed at which the data could be visualised was very impressive, allowing the user to easily compare against other organisations. The table felt that this could be delivered as a shared HE benchmarking resource based on HEIDI data.
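To make the idea a bit more concrete, here’s a minimal sketch of that kind of peer-group comparison in Python rather than QlikView. The file name, column names and the cut-down Russell Group list are assumptions for illustration only; they don’t reflect the real HEIDI schema or the UEL application.

```python
# Minimal sketch (illustrative): benchmark one institution's staff costs
# against a peer group from a HEIDI-style extract. The file name and column
# names are assumed, not the real HEIDI schema.
import pandas as pd

# Partial, illustrative peer group
RUSSELL_GROUP = {"University of Glasgow", "University of Birmingham", "University of Leeds"}

# Hypothetical extract with columns: institution, total_staff_costs, fte_staff
df = pd.read_csv("heidi_staff_costs.csv")
df["cost_per_fte"] = df["total_staff_costs"] / df["fte_staff"]

home = df[df["institution"] == "University of East London"]
peers = df[df["institution"].isin(RUSSELL_GROUP)]

print("UEL cost per FTE:", home["cost_per_fte"].iloc[0])
print("Russell Group mean cost per FTE:", peers["cost_per_fte"].mean())
```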

The full explanation of how this might be achieved would probably be best suited to another blog post. For now, an image of the table’s plan might suffice (click the image to view a larger copy):

Balancing demand/expectations vs capacity to deliver

The table outlined how a project was defined and started, and then all of a sudden things changed, often because of external factors outside the project’s control or because the institutional understanding of BI shifted. Key to this is educating people about the benefits of BI: at one level some people will understand it, while to others it’s very much peripheral. If decisions about BI are being taken by senior managers, they have to better understand what it takes to do this type of work; they don’t seem to understand the work that’s required to deliver a prototype, let alone a finished product. BI competencies/literacies are key too. We need to gain the confidence of senior management and cascade that confidence and understanding throughout the organisation. A focus at sector level on improving competencies and skill levels would be useful. JISC’s role is in facilitating the sharing of information, developing a diagnostic tool along the lines of the BI maturity levels, and providing a shared service/directory of expertise with potential mentoring roles.

Gathering evidence to support the development of BI within institutions

Good business cases backed up by technological evaluations are required, along with evaluation frameworks that are fit for purpose and that fit across different HE domains. There’s a need for good cost-benefit analysis and balanced scorecards, and for critical evaluation of the resources that are available. You have to have good project plans and, overarching that, something about your starting position and how you’re going to get to where you want to be. This isn’t limited to HE: there is a role for the private sector (books, reports, etc.) and for technology providers, although there does need to be a filter here. Other research should be tapped into, along with peer networks; there are a lot of existing networks to draw on, and there’s a difficult task in bringing disparate sources together. There’s a role for funded research in this area, perhaps longitudinal studies: what will the impact be across the current set of projects over X years? Knowledge curation, a resource to disseminate and share evidence, a communication hub, somebody to take a strategic decision and drive this area forward, and peer networks and collaboration areas could all be useful JISC contributions. Success = evidence-based decision-making embedded throughout the organisation.
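As a toy illustration of the cost-benefit point, the sketch below works out a simple payback period for a hypothetical BI investment. Both the function and the figures are made up for illustration; a real business case would rest on a proper evaluation framework and balanced scorecard rather than three numbers.

```python
# Illustrative only: a toy payback calculation of the kind a BI business case
# might include. All figures are invented.
def payback_years(setup_cost, annual_running_cost, annual_benefit):
    """Years until cumulative net benefit covers the setup cost (None if never)."""
    net_annual = annual_benefit - annual_running_cost
    if net_annual <= 0:
        return None
    return setup_cost / net_annual

# e.g. a £50k project, £10k/year to run, £35k/year of benefit -> 2.0 years
print(payback_years(setup_cost=50_000, annual_running_cost=10_000, annual_benefit=35_000))
```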

What might an Emerging Practices curriculum look like for BI?

  • Technology Reviews (Developers/IT Managers)
    • Refers to experience
    • Contacts
    • Gartner Magic Quadrant
    • UCISA List
    • Dynamic
  • Diagnostic Tool to assess capabilities and maturity in BI
    • Strategic ICT Toolkit
  • Managing Change Workshops
  • Analytics Camp
  • Analytical Literacies
    • Visualising, interpreting BI and understanding what happens next
  • Brief introductions to BI technology
  • Key roles and pre-requisites of BI

Upon reflection I think Emerging Practices could help with the technology reviews, managing change workshops, analytics camp, and brief introductions to BI technology. I’d expect the updated BI infoKit to include something around the key roles and pre-requisites of BI. I’m unsure how analytical literacies can be taken forward, and the diagnostic tool would require some development work.
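To give a feel for what such a diagnostic might look like, here’s a minimal sketch that maps a handful of self-assessment questions onto the infoKit’s six maturity levels. The questions and the scoring are my own assumptions for illustration, not anything produced by the programme.

```python
# Minimal sketch (illustrative): map yes/no self-assessment answers onto the
# BI infoKit's six maturity levels. Questions and scoring are assumed.
MATURITY_LEVELS = [
    "Traditional information sources: fragmented and mistrusted",
    "Coherent information: centrally reliable, locally responsible",
    "BI Project: selecting an approach and a vendor",
    "Initial BI system",
    "Growing BI coverage and involvement",
    "Reliable predictions and forecasting",
]

QUESTIONS = [  # hypothetical prompts, roughly ordered by maturity
    "Is there a single trusted source for core institutional data?",
    "Has a BI approach and vendor been selected?",
    "Is an initial BI system in live use?",
    "Is BI coverage growing beyond the pilot areas?",
    "Are the system's forecasts trusted for planning decisions?",
]

def assess(answers):
    """answers: one boolean per question; returns (level_number, level_label)."""
    assert len(answers) == len(QUESTIONS)
    score = sum(answers)                        # each 'yes' moves the institution up a level
    level = min(score, len(MATURITY_LEVELS) - 1)
    return level + 1, MATURITY_LEVELS[level]

print(assess([True, True, False, False, False]))
# -> (3, 'BI Project: selecting an approach and a vendor')
```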

Key audiences for this type of curriculum would include: developers; IT Managers; VC/DVC; Planning; Local Administrators; Service Managers; Academic Managers; External Data Providers; Government Departments; and Local Authorities.

Another area where Emerging Practices might be able to help JISC in the future is supporting institutions in planning their projects pre-funding. One delegate noted how much they had underestimated the work involved and was quite humble in noting how much they had learned along the way. Pre-funding support would have been extremely beneficial, something that ties into research we’ve been carrying out around evidencing impact.

That said, Emerging Practices is still a pilot which we’ll be reviewing in November. We also need to contend with the new JISC strategy and ensure our aims tie in with what’s decided after the review. Interesting times…

Slides from the day are shown below:


Tweets from the day were collected in the “#JISCBI Back Channel” story on Storify.
