In this webinar Staffordshire University and the University of Bolton provide an overview of how they’ve applied Enterprise Architecture (EA) techniques within their organisations.
Ray Reid (Senior Learning Development Specialist, Staffordshire University) focuses on using EA to transform the management of external examiners, whilst Stephen Powell (Reader in Inquiry-based Learning, University of Bolton) talks through a mixed methods approach used to improve faculty business processes.
The webinar was delivered and recorded on 16th January 2013.
Staffordshire University are providing training on the use of ArchiMate® as part of a Jisc-funded project. Register for one of the two upcoming workshops using the links below.
- Enterprise Architecture: An introduction to ArchiMate® – Wolverhampton, Thursday 28th February 2013 (10:00-13:00)
- Enterprise Architecture: An introduction to ArchiMate® – Newcastle, Tuesday 16th April 2013 (10:00-13:00)
The following is a list of resources shared by the webinar’s participants.
- Enterprise Architecture infoKit
- Archi, an ArchiMate® modelling tool
- Nikki Rogers’ blog (Enterprise Architect)
- Nick Malik’s blog (Enterprise Architect)
- An article highlighting Staffordshire University’s experience of using EA
- Enterprise Architecture and the External Examiner approach
- ITANA (Community of Practice)
“Change is endemic in the education sector”
I doubt anyone could disagree with that statement; there certainly wasn’t anyone at today’s workshop who did. External regulations; legislation; funding changes; competition; doing more with less (or the same); new technology; new staff; and new ideas are just a small sample of the drivers for change that we face on a daily basis.
The workshop was organised to support projects from the JISC Transformations and Course Data Programmes. 19 delegates were present from across 13 institutions. Coventry University provided the perfect setting for the workshop with change evident from our first view of the campus and some of their new builds.
During the first half of the workshop John Burke (Senior Adviser, JISC infoNet) provided an overview of change management, starting with some of the theories of change. Of these, Complexity Theory provides the most realistic description of change across the UK’s education sector. Perhaps this isn’t a bad thing, looking at the statement below; how comfortable people are in such an environment is another question.
“Success lies in sustaining an organisation at the border between stability and instability.” (JISC infoNet)
Small wonder, then, that John described the different theories not as a menu from which you must pick one, but as a set in which each would work with different people at different stages.
That’s why it’s important to understand the culture(s) you’re working with when implementing change. Culture is one aspect of change, along with people and processes. When John asked attendees to define ‘culture’, we got a near perfect answer. Culture comprises the:
- shared values between a group of people; and
- history of the individual/team/department/organisation—“the way we do things around here”.
Using a table from the ‘organisational cultures’ section of the Change Management infoKit, you can quickly get an idea of the cultures embedded across your team. Print out a copy of the table for each person. If possible remove the headers; if not, ask everyone to fold that first row over so they can’t see it. This stops people from filling out the form according to how they would like things to be rather than how they currently are. Ask the members of your team to circle one word from each row that reflects the team’s culture. Collect the responses and tally up the results. If you want to be more detailed about this you might also collect individual responses—responses from your team members about their personal preferences. The image below is a snapshot of my (Andrew Stewart) personal preferences, showing that I lean towards ‘Innovative’.
Not only does this give you an understanding of a team’s culture, it gives you a picture of how they currently work. Adapt your methods appropriately to ensure the most effective outcomes. If you need to work more closely with a particular individual you might want to pay closer attention to their personal preferences.
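The tallying step described above can be sketched in a few lines of Python. This is purely a hypothetical illustration: the words and responses here are invented, not taken from the infoKit’s actual table.

```python
from collections import Counter

# Each inner list is one team member's response sheet: one circled word
# per row of the culture table. Words here are invented for illustration.
responses = [
    ["Innovative", "Collaborative", "Formal"],
    ["Innovative", "Competitive", "Formal"],
    ["Structured", "Collaborative", "Informal"],
]

# Tally every circled word across all respondents.
tally = Counter(word for sheet in responses for word in sheet)

# The most frequently circled words indicate the team's dominant culture(s).
for word, count in tally.most_common(3):
    print(word, count)
```

In practice a simple tick-sheet does the same job; the point is only that the aggregate, not any individual response, is what characterises the team.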
In this context, ‘process’ means the various stages people go through during a change initiative. Conner and Patterson (1982) outline eight stages: contact; awareness of change; understanding the change; positive perception; installation; adaptation; institutionalisation; and internalisation. Angehrn describes a much simpler model using four phases of adoption:
During the first two phases it’s vital to increase confidence; phases 3 and 4 are all about motivation. During a change initiative people will go through a whole range of emotions: denial, fear, frustration, scepticism, optimism, hope, and excitement. It’s important to be aware of the emotions people feel throughout the process so you can support them accordingly. As people are unique, you will be dealing with all of these emotions, and people at all the different stages, at the same time.
Leadership and team roles are vital to change management. The following steps were devised by Kotter (1995) and provide a fantastic summary of leadership when managing any change initiative:
- Establish a sense of urgency
- Form a powerful, guiding coalition
- Create a vision
- Communicate the vision
- Empower others to act
- Plan for and create short-term wins
- Consolidate improvements
- Institutionalise the new approach
John also mentioned a 9th step, “Be Tough”. Not in a physical sense, but be prepared to argue your case. With that in mind you might ask yourself the question “do I believe in this?” before leading change yourself.
In the afternoon, delegates had the chance to put into practice some of the techniques they had learned during the morning session. EduChallenge presents the user with a real scenario using a fictional organisation. Users can carry out a number of different actions to try and get individuals from the organisation to adopt change. I always worry about this part of the day; will people see the value in playing a game? Simple answer, yes—absolutely. It mimics real life unbelievably well in a short time span. Although delighted with their results and having enjoyed the simulation, one delegate described how they were “emotionally drained”. Managing change is difficult, and it will be draining at times, but you have to stay resilient. That’s the reality of it.
- Change is difficult! Be aware of that from the outset.
- Understand the type of change you’re implementing.
- Understand the culture(s) you’ll be working with.
- Lead the change, remember Kotter’s 8 steps and be tough when necessary!
- Use the Knowing-Doing Gap to identify where individuals are on the adoption curve.
- Remember that you are dealing with how people perceive your actions. Best intentions aren’t always seen that way.
- Have a meeting with your project team to share intelligence: who are the key influencers across your organisation?
- If you know of resistance, bring it out into the open. Provide people with the opportunity to ask questions and raise concerns. Be open and authentic when replying.
- Language is critical—are you inviting people to a ‘meeting’ or a ‘staff development workshop’?
- Put yourself in the other person’s shoes!
There can be few organisations that have yet to wake up to the inherent value of having strong market and business intelligence (BI) at their fingertips. On the face of it, therefore, obtaining buy-in from your senior management team should be a walk in the park.
It is frequently the case, however, that where the greatest gains can be made, through leveraging a solid data warehouse exposing well-designed models, you run into the issue of obtaining the resource required to best exploit the platform you have available. On top of this, the way in which an improved reporting and business intelligence environment gets introduced can be tricky to manage. All too frequently it is product driven (“I saw this reporting software at a conference; it looked great, let’s buy it!”) rather than value or needs driven, and a lack of clear business value/benefit will quickly derail a good technical implementation, especially if it needs additional resource to scale out to the wider institution post-pilot.
At the other end of the spectrum, it can sometimes be difficult to convince senior management that much can be improved on. After all, no one likes the idea that the current reporting environment is deficient in some way, the implication being that some not-so-great decisions might have been made on the basis of it.
However, you will mostly encounter those who just will never be excited by the idea of data warehouses, semantic models, or in fact any of the IT implementation side of a BI environment and yet it is important to get their buy-in for the resource required.
My experience, predominantly from the IT side, of implementing a new BI environment has brought me into contact with characters from right across this spectrum (I’m sure there are some cast members I’ve missed out here as well), and it’s on that basis that I’ve written this article, with the hope that it might provide others with a strategy for getting to that all-important ‘go live’ point, where ‘new’ becomes ‘standard’.
Running a concrete pilot
The most effective way of engaging the cross-section of stakeholders you want on board is by carrying out a bit of ‘show and tell’. The strength of well-constructed and well-visualised BI reports is the immediacy of the data being represented, and in the context of any demo you give the reports being displayed should always be ‘here and now’ and entirely relevant to your audience.
As such, you will want to consider your stakeholder group, consult with the part of the business driving the BI agenda forwards in your organisation and pick, at the very least, a dashboard construction that contains an area of interest for each of the areas you will be ‘pitching’ to, as well as at a minimum one drill-down report to expand on the data you are presenting at the high level.
This achieves two things. First, it demonstrates that you understand they are a key consumer of the BI platform you are looking to implement and that you recognise their importance. Second, you are putting their real data in front of them. They should recognise it (hopefully!) and, moreover, they should be able to immediately spot the improvements in understanding that data gained by having it presented outside of the usual Excel spreadsheet environment. In other words, you’re showing them how their data can be accessed and represented more effectively.
Who to engage early
If you have a planning department or similar then they should be your first port of call, in fact it is they that should be driving this project from the business side of things and if you can’t get them interested, well, you may be fighting an uphill battle, no matter how good the software you’re looking to implement. They will already know what the key measures are for predicting your institution’s success and will be able to help craft any data sources needed to represent them accurately across the domains you’re interested in demonstrating.
If you don’t have a dedicated ‘numbers and stats’ department, then it will be a process of going back to the source material, examining KPI specifications and rooting out the data. It is certainly more laborious and a more difficult context within which to validate your data; this is something to bear in mind when the inevitable hand goes up with the “that data is wrong” type of ‘question’.
Who to engage in a pilot
Assuming that your planning group are committed to the project it is they who should be able to identify a few data customers most suited to helping you design, build and test (most important this one!) your pilot BI environment. They should be sympathetic to the idea of the project, but some critical engagement is vital, you need your first presentation to have already dealt with the standard “but do we really need *this* data” type questions.
What to include in a pilot
In my institution we piloted department- and faculty-level dashboards representing a cross-section of financial and student-number information: a total of six information panels, each clicking through to a drill-down report matched against the University’s Key Performance Indicators.
This suited us well because it was easy to communicate the value to the institution of allowing heads of department easy access to data that had formerly been communicated via Excel spreadsheets and somewhat clunky distribution mechanisms. Being able to construct a single environment that allowed quick and easy access to all department and faculty level KPI data fitted neatly, both with the data needs of departments and faculties, as well as the Senior Management Team’s interest in driving the University forward based on strategic KPIs.
Oh, and of course, pretty pictures.
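The panel-plus-drill-down pattern described above can be sketched as follows. This is an invented illustration of the shape of the data, not our actual implementation (which was built on the Microsoft BI stack, not Python); the field names and figures are hypothetical.

```python
# Invented row-level data of the sort a dashboard would aggregate.
students = [
    {"dept": "History", "year": 1, "fte": 120},
    {"dept": "History", "year": 2, "fte": 95},
    {"dept": "Physics", "year": 1, "fte": 150},
]

def panel_value(rows, dept):
    """Headline figure for one dashboard panel: total FTE for a department."""
    return sum(r["fte"] for r in rows if r["dept"] == dept)

def drill_down(rows, dept):
    """Drill-down report: the per-year rows behind the headline figure."""
    return [r for r in rows if r["dept"] == dept]

print(panel_value(students, "History"))  # the number shown on the panel
print(drill_down(students, "History"))   # the detail behind the click-through
```

The point of the design is that the headline number and its supporting detail come from the same underlying rows, so the two can never disagree, which is exactly what builds trust in a self-serve environment.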
How to communicate
Initially a presentation was made to the senior management team’s regular meeting. This was definitely a ‘hands on’ demonstration, showing clearly that (a) the system worked and (b) it was already functioning from live data and as such was ‘ready to go’ to a full pilot. There is a definite advantage to presenting at a point where you are only asking for a nod to go ahead and pursue further, as opposed to asking for more resource to be able to expand out into wider engagement. Avoid test data – it breeds suspicion regarding the accuracy of the new environment. Also, ensure that you really can just ‘flick the switch’ and let them use it: give them access immediately afterwards, and ensure that the communication giving them the access details includes your plan for a wider pilot or initial limited roll-out.
Department heads were then introduced to the dashboard environment at their regular meeting, and the planning office regularly had update slots in these meetings as changes were made and enhancements introduced in response to queries and suggestions. Frequent drip-feeding of information, occasionally with technical expertise on hand to answer any difficult infrastructure and data-source questions, builds familiarity and confidence across a wider audience, who will themselves build the new infrastructure into their day-to-day reporting. We found that very little encouragement was needed for department heads to start bringing the dashboard content into their internal meetings, importantly because the report content was exportable and viewable across a variety of devices.
How to maintain engagement
It is important to maintain follow up activities both upward and across, especially if you foresee a need for further resource to improve or scale-out your pilot infrastructure. As such, having a fixed horizon for your pilot activity and defining completely the level and scope of roll-out post-pilot will afford you the opportunity of re-presenting the environment to the Senior Management Team in such a way that, on completion of the pilot phase, you can provide data (ideally in the BI environment again, re-emphasising its usefulness) that demonstrates take-up and use of the infrastructure.
Project this into the future, remembering that you will likely already be receiving requests from other areas across the institution, each looking for their own presence (normally from within the sections and areas that the senior team are responsible for) and pitch any resource requirements accordingly.
Provided you are delivering a reporting environment with good, accurate, and accessible content, it should make its own business case. But providing statistics on the difference between heads of department being able to self-serve this kind of BI, versus ad hoc reports that must be requested in advance and tailored to each department, will demonstrate efficiency gains that push the point home.
Building on success
Once you have a completed pilot the next steps should be clear: roll-out to all user groups and a defined plan for future expansion. Our future plans focused initially more on rationalising look and feel with the corporate brand (still an on-going project) and providing more of the detailed numbers reports in an easier to access and self-serve format. The dashboard environment is in the process of being combined into a single point of access for both faculty and department level information (it began as two separate dashboard views) and KPIs are being re-worked to provide a more useful indication element to the dashboards. We also planned for training of more power users to produce reports and for the replacement of Crystal Reports infrastructure within the Academic Section more generally, building a self-service centre for operational reports frequently used across the University.
We have an active and highly engaged Planning and Strategic Change office that has driven the project from the business side. They not only pushed the important points in front of key stakeholders at demonstrations and follow up activities, but they also got their hands dirty, were first-hand adopters of the new system and therefore could speak with the voice of experience when talking to others about the value the new environment was providing.
Our institution also has a very adaptive and flexible approach to new IT initiatives, primarily because we have a strong heritage in providing new systems based upon internal resource; self-building and self-sourcing infrastructure and driving forward implementation internally. Institutions that need to go outside to find the skills and expertise required to drive the technical side of a project like this will probably have a harder task acquiring initial resource to produce the pilot stage. The fundamentals of any approach should be similar, but with an eye to on-going resource implications that may not affect institutions where skills already exist, or can readily be trained, in-house.
At my organisation we use the Microsoft Business Intelligence stack, comprising SQL Server, Reporting Services (SSRS) and Analysis Services (SSAS). We expose the results through a corporate SharePoint infrastructure. We’ve been able to drive this all with a relatively tight level of technical resource and with significant input from our Planning and Strategic Change section, which has driven the project forward and up-skilled Crystal Reports knowledge into full-blown SSRS report construction abilities.
Guest Post: Author
Development Manager & Business Analyst
MIS, University of Essex
Today’s modelling bash was a get-together of JISC-funded projects (and others) to explore the use of Archi to develop Enterprise Architecture (EA) models. Modelling bashes are self-organised events, so my attendance was very much one of fact-finding and knowledge sharing. Ian Anderson (Coventry University) introduced the day by asking people their main reasons for attending the event, which ranged from:
- Using Archi (how and when)
- Communicating EA models back to people who don’t understand EA
- Using EA to develop a systems strategy
Before handing over to Jo Smith (University College Falmouth), Ian made a very important point in that there is a difference between process mapping and EA Modelling.
University College Falmouth (UCF)
Jo then provided an overview of UCF’s EA journey so far. To give some context UCF is quite a small institution. This might give off the impression that it’s easier for UCF to develop EA models however they still have to deliver and manage the same kind of systems as everyone else with much less resource. They have a very small training budget and generally try to make use of existing resources from across the sector. It was nice to hear how grateful Jo was for their involvement with JISC, opening up access to new resources and networks.
Part of Jo’s role was to come up with a better way of managing projects across the organisation to ensure UCF’s investments resulted in the best possible student experience. With some funding from HEFCE, UCF and the University of Exeter developed the ‘Project Enterprise Architecture Toolkit (PEAT)’. PEAT includes documents, standards, user guides, lesson plans, and training guides to help people across the two universities manage projects more effectively, and there is a public version available. Using EA as a tool helped UCF decide what they should and shouldn’t fund. They are also using the impact calculator to manage benefits, although they still find this a difficult thing to do, a message echoed by many others from across the sector.
More specifically, EA helped UCF to plan: identify what they currently have, and how they should move forward. UCF share a campus with Exeter University and their IT service is a shared service. There are issues around system/service ownership because of this. EA is helping to very clearly map which university is using what services so that they can get the right people involved in the right meetings to ensure they avoid any major system failures.
Jo had brought a printout of their EA model as it currently stands, which the room investigated (worth mentioning: it spanned two desks). At this point discussions broke out around what size of map to produce. Put simply, it depends on what you want to get out of it. Jo demonstrated the vast range of responsibilities held by her staff and how under-resourced they were. In some cases temporary staff were responsible for critical systems; Jo was glad to report they are now on permanent contracts. The problem with Jo’s map was that it was very difficult to follow what linked to what because there were so many connections on one diagram. Jo mentioned that the larger map took a couple of weeks of overtime to develop, and now that they have it, it’s a lot easier to maintain; however, it would be useful to have a dedicated resource focused on developing different views.
Smaller maps are best to tell more specific stories. Wilbert Kraan (JISC CETIS) mentioned the 7 +/- 3 rule. Typically speaking, you should only display 4-10 boxes in one view which gives you enough to discuss for approximately 20 minutes.
At this point you might like to download Archi, install it and take a look at some past models (and even more example models). You might find the open day example particularly useful. Save the file from the link, and open it using Archi. Once in Archi, expand the ‘views’ folder in the branch view at the left of your screen. Take a look through the different views available to you. A different model for each view will be shown in the centre of your screen. The image below (click on it for a larger view) is from the ‘motivation view’ which Wilbert demonstrated during the day as an example of how the tool might be used at a strategic level i.e. taking you through the different layers based upon a particular driver such as ‘cut costs’. Wilbert has also produced a tutorial, ‘hands on ArchiMate® with Archi‘, which is definitely worth reading through.
After Jo’s presentation the room broke out into groups to look at EA models in more detail. Mark Joyce (University of Leeds) shared a model that he had been working on, from a very high level, to show overlaps between various projects being run by different departments from across his organisation. The whole purpose of the map was to get people to recognise they had something in common and that they may need to talk to one another in order to avoid failure. Again, another great example of how Archi could be used strategically.
Community Principles were previously developed by JISC and members of a practice group it used to facilitate. An Archi file is available too, making it easy for others to develop models that can be exchanged between institutions. One of the great things about the modelling bash was that it allowed everyone present some real time to spend trying the tool out and to begin developing their own models. Having people from a range of institutions together in one room helped to generate new ideas on how the tool could be used and ensured greater benefit from wider discussions.
Business Intelligence (BI): Evidence-based decision-making and the processes that gather, present, and use that evidence base. It can extend from providing evidence to support potential students’ decisions about whether to apply for a course, through evidence to support individual faculty/department and staff members, teams and departments, to evidence to support strategic decisions for the whole institution. (JISC, 2011)
I’m in Birmingham today learning about what JISC has been doing in the world of Business Intelligence (BI), as we may bring this area of work under the umbrella of Emerging Practices to amplify outputs, lessons learned, and experiences from the range of projects taking part. Projects include:
- University of Central Lancashire: BIRD—Business Intelligence Reporting Dashboard
- University of Bolton: Bolt-CAP
- University of East London: Bringing Corporate Data to Life
- University of Sheffield: Business Intelligence for Learning About Our Students
- Durham University: Enabling Benchmarking Excellence
- University of Glasgow: Engage—Using Data about Research Clusters to Enhance Collaboration
- University of Manchester: IN-GRiD
- University of Liverpool: LUMIS—Liverpool University Management Information System
- The Open University: RETAIN—Retaining Students Through Intelligent Interventions
- University of Bedfordshire: Supporting institutional decision making with an intelligent student engagement tracking system
- University of Huddersfield: VoRS—Visualisation of Research Strength
Introduction to the day
The day started off with an introduction from Steve Bailey (Senior Adviser, JISC infoNet) and Myles Danson (Programme Manager, JISC), the purpose of the day being to:
- Reflect on the successes of the programme
- Explore what challenges remain and explore strategies for dealing with them
- Consider how JISC/other bodies might help to progress this work
The original call for projects went out in September 2010. Funding of £500,000 was available to fund a number of projects at up to £50,000 each, helping senior managers and decision makers to make better use of both internal and external data in support of institutional management and decision making.
Projects worked in conjunction with the Business Intelligence infoKit, using the BI maturity model to gauge their progress along the way. The maturity model is made up of the following six levels:
- Traditional information sources: fragmented and mistrusted
- Coherent information: centrally reliable, locally responsible
- BI Project: selecting an approach and a vendor
- Initial BI system
- Growing BI coverage and involvement
- Reliable predictions and forecasting
Steve mentioned that the infoKit will be updated with findings from the BI projects and asked projects to make sure they share any ideas on how the infoKit could be improved. Project case studies are due out soon. Steve also mentioned how he is working with the OCU on a white book for BI covering Higher Education (HE) across Europe. The maturity model has been adopted by the OCU, which increases our confidence in the material originally developed for this programme. Adam Cooper (JISC CETIS) also mentioned that Educause are developing a business analytics paper, due out in a couple of weeks, which will be worth keeping an eye on. Other resources highlighted during the opening session included:
- BI Survey Results
- Business Intelligence: Monitoring performance and planning improvement
- Analytics Recon (can’t find a link for this one, still in production?)
Projects then discussed the positive and negative aspects of their BI projects before discussing what they might have changed if they were to do it again. Points raised during this session are listed below, although this list isn’t comprehensive as I struggled to get everything down.
- Enhanced benchmarking capability
- Breaking out of organisational silos to surface more meaningful connections
- All relevant data gathered in one application
- Reputational enhancement
- Surfaced data quality issues which can then be resolved
- Awareness of system
- Process improvements: data contacts; data owners
- Realistic maturity
- Better understanding of needs
- Senior management buy-in
- Inspired other institutions
- Selected appropriate reporting tool
- ‘Agile’ approach with external developers worked well
- Software visualisation applauded
- Software developed
- Involvement of visualisation ‘experts’ helping with the project and wider BI across the uni
- External validation by peers
- Have fallback strategies
- Rapid development can deliver backup targets when required
- Been able to rapidly develop BI apps
- Improved communication across organisational units
- Involvement of wide project team, from range of different sections of uni
- Better evidence for business case and our management decision
- Data quality issues
- Data not fit for purpose: duplication; definitions
- Shifting organisational sands
- Wrong ‘customer’
- Senior management buy-in
- Lacking skills to interpret and analyse the rich data available
- Challenge of defining data eg research themes
- Unrealistic expectations
- It’s my data—unwillingness to share
- The right data?
- Business case needs more work
- Over ambitious targets
- Because of changes during project—data specification changing
- Find new ways of engaging staff
- Long projects are difficult in rapidly changing environments
- Lack of buy in and change of project director
- Skills gap in analytics and structure dispersed expertise
- Found analytics competencies across HE restricted
- Software development better suited to short burst projects
- Timeliness at beginning of project
- Balancing internal demands for BI vs project
- Get relevant stakeholders together at project start
- More innovative and creative and flexible data visualisation techniques
- Data quality and audit tools to identify and resolve data issues
- Integrate data owners more closely with data usage and drivers
- Make BI more ‘sellable’ to senior management
- Get right project sponsor
- Service design techniques (specify what first, how after)
- Grooming senior champion
- Organisational development for cultural change
3 Key Changes
Projects were then asked to vote on what they thought were the three most important changes, highlighted in the previous exercise, that might benefit the sector. Each group within the room was asked to discuss one of the highlighted changes. The following is a summary of what they reported back to the room.
Sector Wide Benchmarking Tool
The UEL team provided a quick demonstration of QlikView (I think this is the right version). Using HESA’s HEIDI data set they were able to quickly benchmark their institution against others. As an example they provided an overview of how their staff salaries compared with those of the University of Glasgow, followed by Russell Group institutions. The speed at which data could be visualised, allowing the user to easily compare against other organisations, was very impressive. The table felt that this could be delivered as a shared HE benchmarking resource based on HEIDI data.
The full explanation of how this might be achieved would probably be best suited to another blog post. For now, an image of the table’s plan might suffice (click the image to view a larger copy):
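The core of the benchmarking idea can be sketched very simply: compare one institution’s figure against the mean of a chosen peer group. All figures and institution names below are invented for illustration; the real demonstration used HEIDI data inside QlikView.

```python
# Invented average-salary figures, standing in for a HEIDI-style data set.
salaries = {
    "Our University": 48200,
    "University of Glasgow": 51400,
    "Peer A": 49800,
    "Peer B": 47100,
}

def benchmark(data, institution):
    """Return the institution's figure and its gap from the peer-group mean."""
    peers = [v for name, v in data.items() if name != institution]
    peer_mean = sum(peers) / len(peers)
    return data[institution], data[institution] - peer_mean

value, gap = benchmark(salaries, "Our University")
print(value, round(gap))  # a negative gap means below the peer-group average
```

A shared sector tool would essentially do this across every HEIDI measure at once, with the peer group (Russell Group, mission group, region) selectable by the user.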
Balancing demand/expectations vs capacity to deliver
The table outlined how a project was defined and started, and then all of a sudden things changed, often because of external factors outside the control of the project, or because of the institution’s understanding of BI. Key to this is an understanding of the benefits of BI across the organisation: at one level people will understand it, while to others it’s very much peripheral. If decisions about BI are being taken by senior managers, they have to better understand what it takes to do this type of work; they don’t always appreciate the work required to deliver a prototype, let alone a finished product. BI competencies and literacies are key too. We need to gain the confidence of senior management and cascade that confidence and understanding throughout the organisation. A sector-level focus on improving competencies and skills would be useful. JISC’s role is in facilitating the sharing of information; developing a diagnostic tool along the lines of the BI maturity levels; and providing a shared service/directory of expertise, with potential mentoring roles.
Gathering evidence to support the development of BI within institutions
Good business cases backed up by technological evaluations are required, as are evaluation frameworks that are fit for purpose and work across different HE domains. There’s a need for good cost-benefit analyses/balanced scorecards, and for critical evaluation of the resources that are available. You have to have good project plans and, overarching that, something about your starting position and how you’re going to get to where you want to be. This isn’t limited to HE: there is a role for the private sector (books, reports, etc.) and for technology providers, although there does need to be a filter here. Other research should be tapped into, along with peer networks; there are a lot of existing networks to draw on, though bringing disparate sources together is a difficult task. There is also a role for funded research in this area, perhaps longitudinal studies: what will the impact be across the current set of projects over X years? Knowledge curation, a resource to disseminate and share evidence, a communication hub, somebody to take a strategic decision and drive this area forward, and peer networks and collaboration areas could all be useful JISC contributions. Success = evidence-based decision making embedded throughout the organisation.
What might an Emerging Practices curriculum look like for BI?
- Technology Reviews (Developers/IT Managers)
  - Refers to experience
  - Gartner Magic Quadrant
  - UCISA List
- Diagnostic Tool to assess capabilities and maturity in BI
  - Strategic ICT Toolkit
- Managing Change Workshops
- Analytics Camp
- Analytical Literacies
  - Visualising, interpreting BI and understanding what happens next
- Brief introductions to BI technology
- Key roles and pre-requisites of BI
Upon reflection I think Emerging Practices could help with the technology reviews, managing change workshops, analytics camp, and brief introductions to BI technology. I’d expect the upgraded BI infoKit to include something around the key roles and pre-requisites of BI. I’m unsure how analytical literacies can be taken forward, and the diagnostic tool would require some development work.
Key audiences for this type of curriculum would include: developers; IT Managers; VC/DVC; Planning; Local Administrators; Service Managers; Academic Managers; External Data Providers; Government Departments; and Local Authorities.
Another area where Emerging Practices might be able to help JISC in the future is supporting institutions in planning their projects pre-funding. One delegate noted how much they had underestimated the work involved and was quite humble in describing how much they had learned along the way. Pre-funding support would have been extremely beneficial, something that ties into research we’ve been carrying out around evidencing impact.
That said, Emerging Practices is still a pilot which we’ll be reviewing in November. We also need to contend with the new JISC strategy and ensure our aims tie in with what’s decided after the review. Interesting times…
Slides from the day are shown below:
Tweets from the day can be found here:
Increasing the synergy between strategy and technology
UK Higher Education, Further Education and Skills are experiencing, and will continue to face, major challenges as a result of political, socio-economic, demographic and technological pressures. The necessity for institutional leaders and senior management to deliver clear institutional vision and corporate strategy has never been greater. Information and Communications Technology (ICT) continues to be acknowledged as a major factor in organisations realising their aims and objectives; consequently, ICT has an important role in the mobilisation of an institution’s strategy.
Funding is down across the sector, organisations are becoming more businesslike, and therefore information is vital. IT is now part of an organisation’s foundations and, as previously mentioned, EA is the perfect world. (David Rose, 2012)
- to develop more constructive alliances
- to provide improved knowledge and insights relating to ICT, and its value to the institution
- to improve decision making relating to institutional and ICT strategy
- to consolidate the research and knowledge already available on this topic
If you’re interested in using the toolkit we’d thoroughly recommend viewing the webinar below, which took place on Wednesday 7th December 2011. It provides an introduction to the toolkit and the experiences of the University of Central Lancashire.
The toolkit itself is hosted by The University of Nottingham: Strategic ICT Toolkit (online) and everything else you need to know is available there. However, if you do have any further questions about the toolkit email JISC infoNet, email@example.com.
A number of currently-funded projects under the JISC Transformations programme are using the Strategic ICT Toolkit. We will summarise their experiences in a later post.
JISC Emerging Practices aims to support UK Higher Education, Further Education and Skills in becoming more strategic in their deployment of ICT across the organisation. Where possible we’re beginning to surface emerging and best practices from across the sector in relation to the strategic enablers outlined within the toolkit: strategic leadership; ICT services; ICT governance; communications and engagement; ICT shared service; and Enterprise Architecture. To date, our focus has been very much on Enterprise Architecture; however consideration of some of the other enablers will follow soon.
In this webinar Steve Bailey (Senior Adviser, JISC infoNet) talks us through the Impact Calculator and how it can be used to measure benefits. Steve also highlights a range of other similar measurement tools available for the sector to use.
The recording is just under 1 hour long and contains input from pilot projects and those attending the workshop.
The Impact Calculator was piloted with six UK Higher Education institutions:
- The University of Aberdeen
- Cardiff University
- The University of Huddersfield
- King’s College London
- The University of Nottingham
- The University of Oxford (Bodleian Library)
A full summary of the pilot project can be found here on JISC infoNet’s website.
We held the second in our series of Enterprise Architecture (EA) workshops in London earlier this week. It was another excellent day, with 27 attendees representing 3 JISC programmes: Transformations, Course Data and Assessment & Feedback.
The workshop was an ‘evolution’ of our first ‘Doing EA’ workshop, held back in March. A previous post provides an overview of the first workshop; in this post we summarise the second workshop, with as little repetition as possible.
What is EA?
In advance of the workshop, all attendees were directed to the recording of our “Introducing EA” webinar, allowing the workshop to get started with a quick table discussion: What is EA? It’s a question we’ve asked before, but one that’s certainly worth asking again! David Rose (who was described in the introduction to the day as “our EA guru”!) asked attendees to briefly discuss their understanding of EA in small groups. Common phrases often associated with EA were used, including: joined-up; a strategic framework; linking business and IT; a way of managing change; a holistic view. A visual overview is captured below:
Of particular note was one description of EA as being transformational, aiming for “the perfect world”, i.e. accepting that EA is inherently aspirational, aiming for ‘Business Modularity’ (as defined by Ross, Weill & Robertson, 2006).
Another important response was that an EA approach “starts conversations”. Effective communication is integral; the greatest stories of success from institutions adopting EA have described how they’ve successfully made important conversations happen. The EA journey of University College Falmouth (read their case study) is an excellent example of this, utilising EA as a key communication tool across the organisation.
“Architecture in general is a response to complexity.”
The Road to Value
The next session introduced the notion that deciding to adopt EA is the start of a journey, referred to as the “road to value”. The position along the road indicates an institution’s maturity with regards to EA adoption. The five steps in the model are: Explorer, Adopter, Implementer, Achiever, Practitioner.
A specific post about the road to value will follow soon… However, these ‘Doing EA’ workshops are primarily for institutions with JISC-funded projects to join the road and start their journey, hearing the experiences of others who have already travelled some distance.
UCLan’s Journey (So Far)
Lucy Nelson, Project Manager, University of Central Lancashire (UCLan), talked openly about her EA Journey so far, which started around 18 months ago. Her presentation captures her experience of introducing EA to UCLan as well as the issues that have been raised during conversations with staff and the results of the Strategic ICT Toolkit.
Lucy believes in a centralised and ‘top-down’ – rather than project based – approach to EA. However, she also described how there is a general sense of “change fatigue” at UCLan, the institutional structure having changed a number of times in recent years. Her approach has changed more recently to one she describes as “EA by Stealth”, where individual projects are encouraged to “see the bigger picture” and EA thinking is integrated into everyday working. Additionally she recently arranged an in-house EA event at UCLan which had a far greater positive impact than individual conversations.
“People involved in the business (of the institution) don’t have time to get involved in a project … so we took a workshop approach and invited them to attend”
Next we heard from two new projects, both from the JISC Transformations programme, as they start their journey with EA. Peter Hooper from Keele University and Fiona O’Brien from University of Westminster described how their projects relate to institutional strategy/policy and expressed their initial thoughts about how EA may help deliver change. They also both revealed that they can foresee issues and barriers ahead, but felt more confident about overcoming them having already heard about Lucy’s experiences, approaches and successes at UCLan.
“The road to value is also about your own professional journey.”
Challenges and Support
Next followed some more small group discussions, considering three questions:
- How do you plan to ‘do EA’?
- What challenges do you foresee?
- What help and support would be valuable?
Communicating EA was seen to be a major challenge, in particular, how to get buy-in at all levels within the organisation. Indeed this was something Lucy commented on in her presentation, describing the desire for a top-down approach, but the reality being that a “stealth” approach was the best way to get things moving. This is supported by the outcomes of previous JISC projects, describing the approach as Guerilla EA.
Many of the early practitioners talk about implementing ‘Guerilla EA’ whereby they specifically avoid mentioning the term [Enterprise Architecture] at all to senior managers until they are able to demonstrate a success and then explain how EA thinking led to that success.
Resistance to change was another challenge raised in the discussions. By applying Guerilla EA, and finding a project that EA could be applied to, it was felt that early adopters could begin to gather evidence of where their team and organisations benefited from the approach. It’s at that point you might decide to highlight your use of EA which could result in greater buy-in across the organisation. The end result hopefully being the adoption of EA and the wider benefits that may bring.
“The risk of a bottom-up approach is that you come up against someone who has absolutely no appetite for change.”
Apart from a request for more workshops (on ArchiMate Modelling, Change Management, and EA) attendees were very keen to receive support on engaging senior managers. One suggestion was to work more closely with the Leadership Foundation on areas of information technology/management. David offered an interesting point in response: that funding is down across the sector, organisations are becoming more businesslike, and therefore information is vital. IT is now part of an organisation’s foundations and, as previously mentioned, EA is the perfect world.
We’ll endeavour to consider all suggestions for further support raised throughout the day, circulating future offerings (workshops, webinars, resources etc.) via JISC Emerging Practices and through JISC programme channels.
Modelling EA, ArchiMate & Archi
Wilbert Kraan from JISC CETIS opened the afternoon masterclass sessions, describing the distinction between “EA – the approach” versus “EA – the thing”, where the approach relates to principles and methods and the thing is the modellable organisational structures, business processes, information systems and infrastructure.
His focus is on modelling EA – the thing, and he provided a practical masterclass on ArchiMate with attendees invited to try out some modelling. Using Archi, a tool developed by JISC CETIS, attendees were asked to reproduce a model previously developed by University of Roehampton. (Try it out yourself with the ‘hands on ArchiMate’ tutorial.) Attendees were then asked to create a model based upon a process from their own organisation. Example models are available from past modelling bashes too.
Feedback was extremely positive with most attendees indicating that they’ll continue using the tool when back at base. There was also plenty of interest in arranging a modelling bash, an event where people developing ArchiMate models get together to co-develop and support one another.
Managing EA Adoption: Engaging the organisation
The EA Management masterclass focused on four key themes:
- How to sustain EA? One suggestion was to identify a business owner from the beginning of the project. Where a business owner could not be identified, serious questions should be asked as to whether the project should go ahead.
- Organisational adoption. Using examples, advocates need to demonstrate how using an EA approach is different to the norm. Early adopters need to seek out people who are receptive to this approach and/or those who have a burning need.
- Where EA fits in. Organisations typically employ a range of methodologies, such as ITIL, MSP, and PRINCE2. David noted that EA helps to fill the gaps that exist. Organisations tend to cherry pick steps from the Architecture Development Method (ADM), a key component of TOGAF, to meet their needs.
- Changing the way people think. People need to be challenged to think about their perfect world, without the restriction of current systems.
Similar to the first workshop, the 3E’s matrix was thought to be a useful tool. Lucy described her adaptation which includes stakeholders and people were very interested to see the examples previously captured.
- Presentations: Introduction to EA; UCLan’s Journey; ArchiMate Intro; Management Masterclass.
- JISC Guidance: Archi; Enterprise Architecture infoKit; Enterprise Architecture Case Studies; Strategic ICT Toolkit.
We would like to send out a BIG thank you to all attendees and speakers (listed below). It was an excellent day with a genuine ‘Learning, by doing, together’ buzz in the room. As we publish this post, feedback is coming in thick and fast, including the following two summaries of the workshop from attendees:
“Informative, interesting and enlightening…. I now feel I understand what EA is and how to use the skill!”
“An excellent and participative introduction to EA, with the added benefit of relevant case studies.”
Speakers/Facilitators: Peter Hooper, Wilbert Kraan, Lucy Nelson, Fiona O’Brien, and David Rose.
In this webinar David Rose talked us through the journey one might expect when leading and managing Enterprise Architecture (EA). EA is fundamentally about change, the way organisations and people do things and the behaviours they exhibit – a tough challenge!
The webinar provided a birds eye view of the management journey, highlighting considerations along the way and providing detail based upon the experiences of past JISC projects. If you are undertaking EA, hearing about the journey already travelled by others should help you define your own path!
The webinar was just under two hours long and included a brief mid-webinar break. When watching the recording, we suggest dipping into specific sections of relevance to you. Below are start times for the three main sections:
- Engaging Colleagues (57mins—1hr 10mins)
- Managing Benefits & Impact (1hr 11mins—1hr 31mins)
- Governance (1hr 32mins—1hr 48mins)
This webinar was delivered and recorded on 12th June 2012.
The following is a list of links shared by the webinar’s participants.
- Enterprise Architecture as Strategy (Book)
- Everyday EA (Book)
- Introducing EA (Webinar)
- JISC Enable (Project Blog)
- Supporting an Innovative Curriculum in a Traditional HE Environment. Developing a winning strategy to support change at Staffordshire University (Article)
- Experiences of EA (Case Studies)
- First Year of EA at Bristol (Presentation)
- Enterprise Architecture at Bristol (Blog)
- Example Benefits of EA (Google Doc)
- How to Measure Anything (Book)
- Benefits Realisation Management (PDF)
Online Webinar: Open to All!
Building upon themes discussed at our most recent workshop and a previous online webinar, we are pleased to announce the second in our series of Enterprise Architecture (EA) webinars—Doing Enterprise Architecture: The Management Journey.
This webinar, facilitated by David Rose (JISC EA Associate), focuses on people who want to take EA forward within their own institutions and what it might look like in practice. Governance is particularly important but it’s also about demonstrating benefits and impact, as well as developing capability. This webinar will help define the nature of an EA champion’s role and provide a better understanding of what somebody starting out in that position might do next.
Register for this webinar, taking place on the 12th June 2012 between 11:30 and 14:00, via the following link: Doing EA: The Management Journey—Registration Form. (Note that the webinar starts at 12 noon prompt; join early for orientation.)
All you need is a PC/Mac, connection to the internet and a supported version of Java which you can test via this link: Blackboard Collaborate Support. If you have any other special requirements please contact us and we’ll attempt to make suitable arrangements.