Sector RSS Feeds

Currently needs feeds for

HEA @HEAcademy
Jisc @Jisc


  • ‘Learners will inherit the future’ - let’s help them
    The current, traditional model of education does not address the needs of a knowledge-based economy. It’s time to seize the opportunity to rethink education and redesign learning. We live in a world where the largest taxi company doesn’t own cars (Uber), the largest movie provider doesn’t own cinemas (Netflix), and the largest social media network doesn’t create content (Facebook). For today’s businesses, agility is everything - and only the bravest innovators flourish.

    The pace of change presents education with a real challenge. In our efforts to close the attainment gap, are we widening the relevance gap? No country wants to offer skills and qualifications for jobs that may not be in demand in years to come. The focus in the UK needs to shift to ensure we equip learners with future-ready skills.

    Unlearning and relearning

    Working with teachers and educators at South Eastern Regional College (SERC), we have started to reshape learning. In the past, education focused on logic and recall. Now, it’s about helping students thrive in an increasingly complex landscape where change is the one constant. Society shouldn’t simply reward people for knowledge, but for what they do with what they know.

    Digital transformation is revolutionising industry and changing the ways we work as we move through the fourth industrial revolution, or Industry 4.0. The education sector needs to respond with an innovative approach where technology is central. Jisc calls this Education 4.0.

    It’s increasingly difficult to imagine how students will succeed without digital skills, and a workforce that has such skills presents opportunities to boost economic growth. Our education system must be sufficiently agile to embrace this ethos, and our teachers must become active agents for change. As educators, we have to unlearn and relearn.

    The push and pull

    Traditional education has focused on ‘push’ not ‘pull’, relying on extrinsic factors, such as achieving a qualification (which may be several years away), to motivate learners. Learners have been told that if they work hard, they will get a good qualification, and then a job. Do students today believe that? At SERC, we have sought to create the ‘pull’ - engaging students by helping them see the relevance of what they are learning.

    Starting the year with an enterprise fortnight, students work in groups on specific industry challenges, uploading their solutions to a web portal. Peers review these and vote on the most effective solutions. Groups present to both internal and external judges at enterprise fairs. They are tackling real challenges that are relevant to them. That means they start their course believing that what they do has impact.

    Employability

    Starting with industry-related challenges, rather than with knowledge to be imparted, highlights the skills and behaviours needed for the careers our learners want to enter and the communities they are part of. We want them to see that what they do in college counts, and to show them they have the potential to be entrepreneurial and innovative. Working collaboratively, showing initiative, dealing with conflict and persisting through challenges while meeting deadlines are essential attributes that employers value.

    Teaching transformed

    This has required a cultural shift for staff. Recognising that teachers are active agents for change, SERC has encouraged innovation: assessing across modules, working on interdisciplinary industry challenges and harnessing digital technology. Here, you might see performing arts and engineering students working with computing students on real-world projects, reflecting the experiences they could face in industry.

    This focus on enterprise and project-based learning has seen learners’ horizons expand and their transferable skills grow. Staff recognise their role as facilitators rather than just imparters of knowledge. It’s bottom-up and top-down - and it’s not just theoretical; we support staff with peer mentors, development days, and opportunities to create and innovate. We encourage a growth mindset, providing scaffolding and ensuring a whole-college approach. We run weekly webinars: Moodle Mondays (sharing ideas around blended learning), Webinar Wednesdays (sharing good practice across the organisation) and one-minute CPD (microlearning hints and tips).

    Assessment on the fly

    Another key change is that we map assessment against learners’ experiences within the context of project-based learning. Awarding bodies are becoming more open to these ideas, such as harnessing mobile technology so we can capture evidence of students’ skills in the workplace in real time. It is a quicker and more accessible way of authentically documenting skills.

    Embracing change

    Our approach has meant some adjustment, because any innovation can seem disruptive. However, feedback from students is positive and teachers value the opportunity to influence change.

    Philosopher Eric Hoffer said: "In a time of drastic change, it is the learners who inherit the future. The learned usually find themselves equipped to live in a world that no longer exists." At SERC, we’re equipping our staff and students for an ever-changing world. The questions we now ask ourselves are: what is worth knowing, and what will we do with that knowledge? That’s our legacy. That’s our ‘why’.

    Paula Philpott is head of the learning academy at South Eastern Regional College. Hear Paula discussing the future of teaching, learning and assessment in FE in a webinar, 11:00 on Wednesday 2 October 2019.
  • Drive your data dashboard with analytics labs
    Higher education (HE) professionals will discover the power of data dashboards in a new continuing professional development (CPD) service launching this month. Catherine O’Donnell, who has already sampled analytics labs as a beta service, describes how the programme benefited her.

    During 20 years in HE as a teacher, trainer, learning technologist and manager, I had never created my own interactive data dashboards. That changed when I joined analytics labs. As a research and impact manager for widening access and participation (WAP) at Ulster University, I was familiar with working with data to create visualisations, but joining this programme allowed me to pick up a host of new skills.

    Analytics labs is a CPD programme that brings together teams of data analysts from HE to learn how to use dashboards to help solve some of the key challenges the sector is facing. I first heard about the labs at a Jisc Connect More event in Belfast in 2017 and was inspired to apply.

    Meaningful dashboards

    At the time of my application I was chairing a data working group in my institution. Part of our remit was to consider how best to develop meaningful dashboards for WAP. Our main goal was to provide data that would help staff make data-informed WAP decisions to support student success and result in learning gain. We wanted to provide access to a range of WAP data that supports students throughout their entire learner journey, from pre-entry, through their university studies and into employment.

    The labs are designed to allow participants to work collaboratively on a defined project that provides opportunities to gain skills in data visualisation, including using Tableau, Alteryx and other tools. It was also an opportunity to participate in agile development, collaborate digitally - with the chance to use tools that enable remote working - and understand the data landscape.

    When I signed up, I had to agree to dedicate 13 days of my time over three months. The first session was a showcase event, which provided an opportunity to meet my team members, who were drawn from several different institutions, and to see outputs from previous participants. I found the dashboard walk-throughs particularly beneficial and was excited to be a part of it.

    My team was assigned the Research Excellence Framework 2021 (REF21) topic. Initially, I and others did not know much about REF21, but we had a fabulous leader who helped us get up to speed. If I’d been given a choice of topic, I probably would have picked learning analytics, but working on something I wasn’t familiar with didn’t detract from what I gained by taking part.

    Around a dozen further sessions followed with my team, including face-to-face meetings at different locations, with the rest held remotely. These meetings provided great opportunities to share practice, advice and feedback. Jisc and HESA provided the data we needed and gave us access to Alteryx and Tableau, which allowed us to analyse the data effectively. Lots of support was available on demand from peers and Jisc experts.

    Talented people

    Taking part allowed me to meet some talented people with different areas of expertise, all willing to provide support and share skills. We soon got to know everyone’s strengths and areas of interest and regularly evaluated each other’s contributions. The final session involved showcasing our dashboard outputs to other groups, including the new cohort joining the programme, and to an expert panel.

    I was super-impressed with the quality and diversity of the dashboard outputs produced in such a short space of time, and left feeling really inspired and motivated, with lots of ideas about what I wanted to do next for WAP. I was very proud of what our team, ‘The REFengers’, produced, and I was delighted with our feedback. We used data including research income, staff characteristics and postgraduate research student numbers to create the dashboards. These explored the national and institutional picture of REF14 and the strength of research environments, and made predictions about what REF21 might look like.

    Transformational

    For me, participating was transformational. It gave me opportunities to learn new skills and develop confidence. What I gained personally also benefited my institution. I have since been able to create many dashboards using both external data available in the public domain and internal data for WAP objectives. Some of the dashboards I’ve created using external data cover primary and post-primary free school meal entitlements, special educational needs profiles, GCSE and A-level attainment, and WAP comparisons. Using internal data has enabled me to create dashboards that allow Ulster University to better explore WAP data at university, faculty, school and programme level.

    Initially, analytics labs pushed me outside my comfort zone and, if I am completely honest, at times I thought I was further behind than I actually was. But when I had time to use Tableau with my own data after the programme ended, the dots connected, and I now use Tableau daily to create dashboards for WAP. When my analytics labs time came to an end, I signed up as a member of the alumni forum to keep up to date. I was recently asked to sum up my experience of analytics labs in three words - I picked transformational, inspirational and motivational - and I would strongly recommend this programme to others.
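    The reshape-then-visualise workflow described above can be sketched in a few lines of plain Python. Everything here is invented for illustration - the figures, the categories and the crude ASCII "chart" - and the Python stands in for the Alteryx and Tableau stack the labs actually used.

    ```python
    # Toy widening-access (WAP) dashboard step, in plain Python.
    # All numbers below are made up for illustration only.
    from collections import defaultdict

    # (school type, free-school-meal entitled?, number of students)
    records = [
        ("primary", True, 120), ("primary", False, 480),
        ("post-primary", True, 90), ("post-primary", False, 510),
    ]

    # Step 1 (the Alteryx-like part): reshape raw records into a summary.
    totals = defaultdict(int)
    entitled = defaultdict(int)
    for school_type, fsm, n in records:
        totals[school_type] += n
        if fsm:
            entitled[school_type] += n

    rates = {t: entitled[t] / totals[t] for t in totals}

    # Step 2 (the Tableau-like part): render the summary as a bar chart.
    for school_type, rate in sorted(rates.items()):
        bar = "#" * round(rate * 50)
        print(f"{school_type:>12} {rate:6.1%} {bar}")
    ```

    A real dashboard adds interactivity and live data sources on top, but the underlying shape of the work - clean, aggregate, visualise - is the same.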
  • One giant leap? Imagining the University of Mars
    I was just a baby when Neil Armstrong took that one small step from the ladder of the Apollo 11 lunar lander to the surface of the Moon on 20 July 1969. As a kid I often wondered about that giant leap for mankind, which was already starting to seem like something from a bygone era. Just picture how NASA’s Gene Cernan felt about being the last man on the Moon, the last human to set foot on another world. Finally, 50 years later, it feels like we’re back on track at last.

    As we celebrate the anniversary of the first Moon landing, a grand total of just 571 people have been into space. Why is this? Well, until recently space travel has been eye-wateringly expensive, with launches typically costing hundreds of millions of dollars. Now, technical advances from companies including Virgin, SpaceX and Blue Origin are set to dramatically reduce the costs of getting people and things into space. How so? It’s simple, really: land the rocket and fly it again. Figuring out how took a while, though - SpaceX alone spent 13 years working on this.

    Space is the place

    While we can’t yet nip off for a weekend break in outer space, within the next decade we will start to see more and more people routinely living and working off-world. This will have huge implications for research and innovation, because we will need to work out how to build the infrastructure and the industries that will sustain human life in space and on other planets. And yes, one day the first students will attend lectures at the University of Mars.

    Let’s not underestimate what all this means, though. Everything we take for granted on Earth will ultimately have to be created from scratch on other worlds - from food and drinking water to power generation and an interplanetary internet that the colonists and their robots can use to communicate. Picture an asteroid-belt mining habitat, a lunar hotel (dare you risk a night on the dark side?), a Martian colony. Somewhere off-planet, a seed is germinating, a shoot is growing, a leaf is forming - and a plant is starting to grow…

    Research that’s out of this world

    This might all seem like something that happens somewhere else, to someone else, but the UK is actually a world leader in space - we just don’t like to shout about it. We made around 40% of the small satellites in Earth orbit right now, and will soon be launching them into space ourselves as Virgin Orbit begins operations at Spaceport Cornwall in 2020. Whoosh!

    At Jisc, we’re delighted to be supporting the UK’s space innovation community through key digital infrastructure, such as the national research and education network, Janet, and our eduroam wireless roaming service. Space is in our DNA: our Janet team is based at Harwell Campus, home of RAL Space and the European Space Agency, and Janet also connects key UK space sites such as the Goonhilly earth station. It may be a while before we have interplanetary Janet up and running, but I was delighted that, in a world first, we were able to demonstrate eduroam over 5G to the University of West London’s vice-chancellor this week.

    The services we run and the support we provide are constantly evolving to meet the needs of our further and higher education members and customers, and to keep abreast of new technological developments - from 5G to artificial intelligence (AI), augmented reality to the quantum internet. How could new technologies such as artificial intelligence and mixed reality transform research? We’d love to hear from you - take our edtech challenge, and we’ll work with the winners to bring their ideas to life (closing date 31 July 2019).
  • Preparing frontline staff to deal with students in distress
    Would you know what to do if a student launched into an angry tirade in the library, or dissolved into tears during a tutorial? How should you deal with the immediate situation? What is the safeguarding policy? How and when should you contact student wellbeing services?

    Frontline staff are often the first to see or hear of students in distress, but this can be emotionally challenging. It’s important, therefore, that employees feel confident in their ability to handle such situations and provide the right support for students at the right time. To help, and in response to calls from members, we are about to launch a free pilot course.

    Earlier this year, members at our stakeholder forum identified student wellbeing as a priority for their colleges and universities, and feedback from our account managers showed a need for staff training to support that theme. We know that many frontline staff have completed Mental Health First Aider training, which is very good at providing an overview of recognised mental health issues, such as depression or anxiety. Our training, which takes a broad look at all sorts of emotional difficulties as well as mental health conditions, provides the opportunity to build upon and broaden these skills and knowledge, with space to practise and discuss them using example scenarios.

    Our aim is for frontline staff to gain confidence in their existing ability and to develop new skills to support their role, which can also be helpful outside the workplace. We aim to boost confidence in decision making, reduce stress reactions and provide a safe place to practise mental health first aid. We will also be encouraging people on the course to share best practice, so members can learn from each other.

    Steve James, who has ten years’ experience as a mental health practitioner in the NHS and a national military charity, will deliver the free online pilot training course, which takes place over two afternoons, 23 and 25 July. There are still some places left. For more information, email or call 01235 822242.
  • College applies "any time, any place, anywhere" studying principle to its digital transformation project
    Remember that cheesy 80s advert for Martini which encouraged us to enjoy it “any time, any place, anywhere”? Three decades later, minus the booze, the slogan perfectly captures how our young people expect to study today. Achieving this ideal is only possible with rock-solid IT infrastructure to support resilient connectivity, coupled with a large measure of cross-campus digital strategy, all under a senior management umbrella. Voila! The perfect FE cocktail!

    Seriously, though, once you’ve got the key ingredients in place, your students will have the tools to thrive. And isn’t that why we all work so hard - to give our learners the best possible experience and chance in life? Even more seriously, mixing the cocktail is not a defined art, and each college and learner cohort will have different needs. My own experience started in September 2016, when I began leading a digital transformation at one of the largest further education organisations in the UK, Leeds City College, with more than 30,000 learners across six campuses. The college was having to think of innovative ways to save money and space while also increasing the quality of our offer, and technology was part of the answer.

    The power of rewards

    It was certainly a challenge, but it’s paid dividends in so many ways. We’ve improved our Ofsted rating, improved maths and English outcomes, extended our student reach, reduced teacher workload and saved money. We estimate “server” savings of £750,000 because 25,000 students now have access to unlimited storage in Google Drive, and the difference in cost between 4,500 laptops and the same number of Chromebooks is a huge £1.35m. Money is tight for us all in FE, and for us, these changes were a no-brainer.

    The college has also climbed from an Ofsted report stating “requires improvement” to - in 2018 - “good, with outstanding features”. Previous feedback noted we were using too many digital tools and platforms, which was confusing, so we decided to focus on Google’s productivity and collaboration toolkit, G Suite, which has features to help teachers be more innovative and less tied to admin tasks. The result is a shift in pedagogy and delivery towards more independent and online learning, and increased accessibility, with that flexible “Martini mentality” approach to study.

    Sell it!

    Achieving successful change in a large organisation, however, requires a culture shift. You can expect push-back from some critics, usually those who say “but we’ve always done it like this”, and others will bury their heads in the sand, but don’t be afraid to be tenacious in your approach. Winning over the naysayers at every level is essential, because every member of staff needs to be invested in giving our learners the best possible experience.

    We’ve all seen the headlines about robots taking our jobs, so there’s a fear factor to overcome too. It sounds harsh, but teachers must get on board; technology won’t replace teachers, but teachers who use technology will probably replace those who don’t. My advice is to package the use of edtech as a means of saving time on admin and other mundane tasks, leaving teachers space to help those students who need the most support.

    Support for learners

    Giving students more support was behind one of the first physical changes we made to the college as a means of breaking down barriers to learning. We knocked down walls to create huge spaces where we set up “independent learning zones” and “break-out zones” to encourage study outside the classroom. In these areas we provide access to Chromebooks, while maths and English teachers are on hand to give one-to-one help to those taking resits. We soon found an improvement in digital literacy and research and study skills, and the focus on English and maths paid dividends too; since we started this model in September 2016, results have risen from below to above the national benchmark.

    Staff development

    For students to gain maximum benefit from technology, staff need to be comfortable and confident with digital tools, so we provide a range of training options, from short, sharp bursts of 20 minutes to full-day workshops, all delivered in a variety of ways - face-to-face and online, with live streaming an option too. At sessions on campus we provide refreshments and small prizes, and those who do really well (and are nominated as such by their peers) are rewarded with a visit to one of the Google offices. We try to make it fun!

    Want to know more? Steve is talking about the Leeds City College digital journey at our Connect More event in Manchester on 27 June. Can’t make it? Steve has an open-door policy, so you’re welcome to visit. Contact him by emailing
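    The savings figures quoted in this piece can be sanity-checked with some quick arithmetic. The totals come from the article; the per-unit breakdown is our own back-of-the-envelope calculation, not a figure the college published.

    ```python
    # Totals as quoted in the article.
    server_saving = 750_000     # £750k saved by moving storage to Google Drive
    students_on_drive = 25_000  # students with unlimited Drive storage
    device_gap = 1_350_000      # £1.35m laptop-vs-Chromebook cost difference
    devices = 4_500             # number of devices compared

    # Implied per-unit savings (our arithmetic, not the article's).
    print(server_saving / students_on_drive)  # 30.0  -> roughly £30 per student
    print(device_gap / devices)               # 300.0 -> roughly £300 per device
    ```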
  • How do monographs fit with the open access agenda?
    In the UK, the push towards open access (OA) monograph publishing dates back to at least 2013. That was the year the Wellcome Trust included monographs and book chapters in its OA policy, and the former higher education funding body for England, HEFCE, posed a number of questions relating to open access monographs in its Research Excellence Framework consultation. The trend since then has been clear. The revised guidance on Plan S, the international push towards OA mandates, published on 31 May, states that:

    “cOAlition S will, by the end of 2021, issue a statement on Plan S principles as they apply to monographs and book chapters, together with related implementation guidance”.

    The four UK HE funding bodies have signalled the intent to mandate OA for monographs submitted to the Research Excellence Framework beyond the 2021 assessment, and UK Research and Innovation (UKRI), a signatory of Plan S, has launched its own open access policy review, which will include monographs and book chapters. Therefore, by 2021, the major UK funders will have implemented policies and mandates on OA monographs, joining a growing international list. But each country in Europe is at a different stage of enabling OA for monographs. As yet, there is no unified solution for this transition, but we can learn from each other, coordinate activities and help to build a better system.

    The Knowledge Exchange (KE) partnership of six national organisations in Germany, the Netherlands, Finland, France, Denmark and the UK is behind an effort to provide just such coordinated support for open access monographs.

    One of the key questions that remains is how to encourage more authors to publish their monographs OA. The Jisc KE survey (pdf) revealed that concerns over sustainability, copyright and third-party rights, quality issues, and trade and crossover titles are high on authors’ agendas. It is key to engage authors in a debate around these issues.

    In May, the partnership published its latest report, towards a roadmap for open access monographs (pdf). The new report includes recommendations and best practices around the themes of policy development, author engagement, technical infrastructure and the monitoring of OA monographs. The UUK monographs working group (of which Jisc is a member) has recently reported on two events: the first for arts and humanities learned societies and subject associations, the second a workshop for publishers.

    There is a major concern among authors that funder mandates will require them to publish OA without the funding to do so - for example, under a model where book processing charges (BPCs) are levied on the author, funder or institution to cover the publishing costs. It is important to explore different types of models and to think carefully about the pros and cons of each, and the effects each might have on the scholarly monograph. A single business model is not desirable for a diverse ecosystem of OA publishing. What we need most are policies supported by sustainable business models and clear paths for researchers to apply for the necessary funds.

    The KE initiative and the recent flurry of funder activity show that open access is important to the future of long-form scholarly communication. To encourage further take-up and to support policymaking, it is also important to articulate the benefits of open access monographs. To this end, Jisc has started a new project (part of our open metrics lab) that aims to show how mining references from OA monographs can help researchers to understand new research fields. The project is also put in context by a review of existing related initiatives (pdf). We are also very pleased to be part of the Community-led Open Publication Infrastructures for Monographs (COPIM) partnership, led by Coventry University, which has been awarded £2.2m by Research England to improve and increase OA publishing.

    For many academics, OA might present an opportunity to rethink their approach, rather than viewing it merely as something that is happening to their research disciplines. However, there is still much work to do in defining policy and addressing misunderstandings and concerns.

    Join our free event, OA monographs: policy and practice for supporting researchers, taking place on 4 July 2019 in York.
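    As a rough illustration of what mining references can mean in practice, even simple pattern matching over a monograph’s bibliography yields aggregate signals about a field. This is a hypothetical sketch, not the open metrics lab’s actual code, and the reference strings are invented.

    ```python
    # Hypothetical example: extract publication years from reference strings
    # and count them, to sketch the "shape" of the literature being cited.
    import re
    from collections import Counter

    references = [
        "Smith, J. (2014) Open Books. London: Example Press.",
        "Jones, A. (2016) 'Mining citations', Journal of Examples, 4(2).",
        "Lee, K. (2014) Digital Monographs. Oxford: Sample Books.",
    ]

    years = Counter()
    for ref in references:
        match = re.search(r"\((\d{4})\)", ref)  # year in parentheses
        if match:
            years[match.group(1)] += 1

    print(years.most_common())  # [('2014', 2), ('2016', 1)]
    ```

    Real reference mining has to cope with far messier citation styles, but the principle - turn free-text references into structured, countable data - is the same.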
  • Governing body encouraged to take responsibility for cyber security
    As the number and sophistication of cyber incidents increases, senior managers are under growing scrutiny to provide evidence showing how their businesses are protected. According to the latest Cyber Security Breaches Survey from the Department for Digital, Culture, Media and Sport, two-thirds (65%) of medium to large businesses identified at least one breach or cyber attack in the last 12 months. It is therefore no wonder that nine in ten (89%) directors or senior managers say that cyber security is a high priority.

    But too often cyber security is managed solely by IT departments, which makes it difficult to join up the overall governance of digital services. Cyber risks affect wider operations and should be included and addressed in governance and management processes across the organisation. The risk can no longer be delegated away from the governing body, and the executive management are more accountable than ever for cyber resilience and the costs incurred by cyber crime. Being able to produce evidence of appropriate action taken to protect the business will be key in meeting the expectations of an organisation’s stakeholders and regulators.

    The recently introduced British Standard 31111 was developed by the BSI Risk Management Committee to help top executives better understand and manage the cyber risks to their organisations. Assessment against this standard is a new service offered by Jisc’s cyber security team, which carried out one of the first high-level applications of the standard at CERN, the European Organization for Nuclear Research, in Switzerland.

    Can your organisation meet the BS 31111 cyber security standard? You’ll be well on the way if you can comprehensively answer the following four questions:

    • What are your levels of cyber risk, and what are the levels of investment to mitigate them for each department?
    • What level of prevention and response capability is available to manage a cyber incident?
    • How does your organisation manage and understand change across the cyber landscape?
    • What resources (eg financial, human, information, technology) are needed to meet the principles and objectives defined in your cyber risk management and resilience policy?

    BS 31111 audit and assessment is a component of our wider cyber security assessment service.
  • Is your college future-ready?
    Ahead of his talk at the Aoc/Jisc Technology Summit on Monday 17 June, Robin Ghurbhurun encourages FE leaders to prepare for Industry 4.0. Artificial intelligence (AI) is increasingly becoming science fact rather than science fiction. Alexa is everywhere from the house to the car, Siri is in the palm of your hand and students and the wider community can now get instant responses to their queries. We as educators have a duty to make sense of the information out there, working alongside AI to facilitate students’ curiosities.Embrace innovationOur sector should not simply try to keep up with the latest innovation. Fads come and go. We have all experienced technology disruptors - the written word and books were at one time considered a threat to power through knowledge. Video didn’t kill the radio star, it made radio stars more accessible.Both these examples provided educators with the sensory tools to enhance engagement and create experiential learning platforms. Now, we all have unimaginable instant access to data and information - and that’s a game-changer.'Always on'Ignorance is inexcusable. Today’s Gen C students are always connected - in their friendship groups, for their news services, their social arrangements and engagements. We would be foolhardy to ignore that.[#pullquote#]Instead of banning mobile phones on campus, let's manage our learning environments differently[#endpullquote#]Instead of banning mobile phones on campus, let's manage our learning environments differently for the betterment of all. Let’s set boundaries, decide which tools to use, ask when technology can help and when it isn’t appropriate. 
It’s our duty, as leaders and teachers, to understand where our students are sourcing their information, how credible it is, how they can apply it, and how they can benefit from it now as learners and in future as employees and employers. The arrival of 5G will make access still more ubiquitous, placing ever greater demands on our ability to judge facts, reality and appropriate application. Are you ready? Change for good: as we reach the cusp of the second decade of the 21st century, technology should be making education more accessible, and it should be helping our teachers teach. We must use it wisely to narrow the social divide, not widen it. We need to plan strategically to avoid a future where only the wealthy have access to human teachers, whilst others are taught with AI. We want all students to benefit from both. We should have teacher-approved content from VLEs and AI assistants supporting learning and discussion, everywhere from the classroom to the workplace. Let’s learn from the domestic market: witness the rise of co-bot workers coming to an office near you. We have to think carefully about our pedagogy moving forward, and scrutinise the role technology plays in preparing Gen C for their future employability. Infrastructure is still the priority. At Richmond upon Thames College, we aim to integrate Internet of Things (IoT) gateways to support devices - from wearables to learning and customer analytics. 
This ‘big data’ environment will be supported through a high-density wifi infrastructure across our new campus. We’re also thinking about how people will engage with technology, personalising each student’s learning experience, and using augmented reality maps to help students navigate the building so they always feel comfortable and confident. Walking the talk: preparing successfully for the fourth industrial revolution - Industry 4.0 - is about understanding what digital transformation means to you and your community. How are you, as a leader, walking the talk? How can developments such as augmented reality and AI improve your stakeholder experience? Strategically, our collective journey must be the shift from where colleges currently are with learning analytics to where we want to be with cognitive analytics and artificial intelligence. Most crucially of all, can you define what success will look like? We must start with transforming the culture of our organisations to be digital-first. In the past, the education sector has been behind the curve with technology – and, to an extent, that’s been OK. Today, that approach has reached its limits. To be prepared for tomorrow, we should equip our organisations to leap from Education 2.0 to Education 4.0. Robin Ghurbhurun is delivering a talk, ‘Leading and driving technological change through an intelligent campus’, at the AoC/Jisc Technology Summit at Google in London, 17 June 2019.
  • How technology can help your brain work smarter
    The brain is a more powerful learning device than any piece of technology. In his talk at the AoC/Jisc Technology Summit, Alex Beard urges delegates to take human intelligence seriously, developing technology that supports our capacity to learn. Your head contains a learning device far more powerful than a computer. The 100 billion neurons of the human brain are each connected with as many as 10,000 others, creating a neural network of incredible power. Where a typical laptop carries out a rapid one billion operations per second, your brain does an unfathomable 2.2 quadrillion. That’s 15 zeros. Despite this, in our digital era, we’re increasingly obsessed with improving our tech, inventing AI that can do everything from recognising faces to radiography, chess-playing to bot-chat. That’s exciting. But in frantically celebrating the power of technology, we risk undervaluing human intelligence. Humans are born to learn, and our brains are incredible learning devices. Three steps to upgrading your potential: what would it mean if we took our natural, organic intelligence more seriously, using our capacity to learn as a starting point and building technology around that? Too often, we take the tech as the starting point. To avoid becoming more stupid, losing our ability to think, or forgetting what we know as we’re sucked stealthily into the machine zone, here instead are three ways to upgrade ourselves. First, invest equally in exploring human and machine learning. Since IBM’s Deep Blue beat Garry Kasparov at chess in 1997, artificial intelligence (AI) has taken off. But so too have our own minds, not least in their ability to invent ever more powerful AI. Under the right conditions, your human intelligence grows hand-in-hand with new technology. 
There are some fantastic examples of this all over the world, in use in schools and universities. Today, platforms such as Century Tech in the UK and Squirrel AI in China adapt learning, creating a precise diagnostic profile for each individual and sharing that information with teachers to help them be even better in their practice. It’s impressive tech, but it only functions successfully in the hands of human teachers and learners. Second, resist user-friendliness. Your tech isn’t interested in making you smarter, only in harnessing your attention with a continual flow of zaps, nudges, buzzes and bleeps. Sure, it’s fun – like candy for a kid. But just as those delicious sugary snacks have no nutritional value, neither is there learning value in what apps are feeding you. Ecole 42, a self-guided coding university based in Paris, is taking this insight seriously. Remarkably, the university has no teachers. Everything is done online, and students have to work out their own path and collaborate with their peers. It’s an interesting example of what tech can do when learners have to figure things out for themselves. Third, don’t automate human intelligence – augment it. Your tech wants to make life easy for you, outsourcing map-reading, calculation and memory. But your brain thrives on challenge. Just as you don’t get any fitter at the gym unless you feel the burn, so you can’t expand your mind without some intellectual heavy-lifting. There are tools that help humans do that lifting. I recently met a teacher in Finland who uses fairly simple technology – such as Google Sheets and vlogs - to deliver content in his classroom. He casts himself as a coach, giving pupils feedback on their perseverance, creativity and co-operation. 
The tech stays in the background, leaving the thinking to the students and teacher. Finally, remember that you are born to learn. Right now, machines are making you stupid – but what might the future look like if we use tech wisely and train people in the right ways? If you follow these rules, it’s a simple equation: tech plus brain can make you smart. Alex is speaking at the AoC/Jisc Technology Summit in London on 17 June 2019. Registration is now open.
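As an aside, the scale comparison at the top of this piece can be sanity-checked with back-of-envelope arithmetic. The figures below are the article’s own round numbers (100 billion neurons, up to 10,000 connections each, one billion laptop operations per second, 2.2 quadrillion for the brain), not precise neuroscience:

```python
# Back-of-envelope check of the article's round numbers.
neurons = 100e9               # ~100 billion neurons
connections_per_neuron = 1e4  # "as many as 10,000 others"
connections = neurons * connections_per_neuron

laptop_ops_per_sec = 1e9      # "one billion operations per second"
brain_ops_per_sec = 2.2e15    # "2.2 quadrillion" = 2.2 x 10^15 ("15 zeros")

print(f"neural connections: {connections:.1e}")                             # ~1.0e+15
print(f"brain vs laptop:    {brain_ops_per_sec / laptop_ops_per_sec:.1e}x") # ~2.2e+06x
```

On these figures the brain comes out roughly two million times faster than the laptop, which is where the ‘unfathomable’ gap in the text comes from.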
  • Ministries of magic: now we can all be tech wizards
    As augmented and virtual reality become increasingly accessible and affordable, why not consider adding a little magic to teaching and learning? Do you remember when the internet and the web were the preserve of wizards and gurus, while regular folk went about their lives largely unaware they existed? Can you remember the shiver that went down your spine the first time you saw a ‘www.’ in the wild? For many of us in tech at the time, it felt a bit like the parallel universes of the wizards and muggles in the Harry Potter stories – with the delight of being part of an underground movement with its own sigils and shibboleths. That’s where we are right now with virtual reality (VR) and augmented reality (AR): on the cusp of another giant leap that could transform our lives as much as the web did, if not more. The early web over-promised and under-delivered, as anyone who ever had to wait for their modem to connect will remember. There were so many false starts and reboots as the internet evolved from a research network into a public service. Local loops were unbundled, fibre was pulled, and web technologies developed to the point where complex apps, such as Google Maps, became possible. Today, we’re at an equivalent inflexion point with virtual and augmented reality, as common software platforms, including Apple’s ARKit and Google’s ARCore, appear, and headset and video capture hardware becomes commoditised. Putting yourself in someone else’s shoes: an area that’s particularly interesting in education is technology that helps learners experience situations from other people’s perspectives. Let’s say you’re a student thinking about going to college, contemplating course and career choices, or just planning a holiday – now you can use virtual reality to see through the eyes of someone doing everything from visiting Machu Picchu to fixing the hybrid engine on an electric car. 
And, with 360-degree video capture, you can create your own immersive experiences using just a camera, a mobile phone, some free software and a £5 cardboard headset. Perhaps even more profoundly, we can use this technology to help people better understand the lived experience of someone who is not like them. Picture experiencing life as a climate refugee, an autistic person, or someone from a different race or religion. The Open University’s Virtual Inclusion project (below) is a great example – an interactive experience that lets you step into the shoes of three schoolchildren facing discrimination, aiming to help build empathy and social inclusion. Launch the world of Virtual Inclusion experience. [Image: Virtual Inclusion launch screen. Credit: The Open University] If you are viewing this on a desktop PC, you will need to move your mouse around to experience the full 360-degree view. You can choose between three different characters, each facing social discrimination in their lives. 
At the end of each, you will choose how to respond to this discrimination, either as a viewer or as the character. To find out more, or to try the mobile app version, which you can use with a virtual reality headset, visit the Open University’s Virtual Inclusion project page. What’s next? Our latest edtech challenge (running until 31 July 2019) is looking at ideas for how immersive technology could be transformative – in this case for research practices. We’ve also recently kicked off a Jisc project looking at how immersive technologies can help with teaching and learning, and what we can do to support our members. To help catalyse the conversation, we created Natalie_4.0, an immersive virtual experience of what a day in the life of a student ten years from now might look like. We’d like to hear from you about how you think these technologies could add a little magic to teaching, learning and the student experience. Catch Jisc’s session on ‘virtual and augmented reality in education’ and meet Natalie_4.0 at Connect More events UK-wide until 4 July. Registration is open now.
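For the curious, the 360-degree viewing described above has simple geometry underneath: the player maps the direction you are looking in (yaw and pitch) to a pixel in an equirectangular video frame. A minimal sketch of that mapping follows; the function name and the 4096x2048 frame size are illustrative, not taken from any particular player:

```python
import math

def equirect_pixel(yaw, pitch, width, height):
    """Map a gaze direction to a pixel in an equirectangular frame.

    yaw: radians in [-pi, pi], 0 = centre of the panorama
    pitch: radians in [-pi/2, pi/2], 0 = the horizon
    """
    u = (yaw / (2 * math.pi) + 0.5) * width   # longitude -> column
    v = (0.5 - pitch / math.pi) * height      # latitude  -> row
    return int(u), int(v)

# Looking straight ahead lands in the centre of a 4096x2048 frame
print(equirect_pixel(0.0, 0.0, 4096, 2048))            # (2048, 1024)
# Turning 90 degrees right moves a quarter of the way across the frame
print(equirect_pixel(math.pi / 2, 0.0, 4096, 2048))    # (3072, 1024)
```

Moving the mouse (or turning a headset) just changes yaw and pitch; the player re-samples the frame around the new pixel on every refresh.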
  • How can research benefit from increased spending in R&D?
    The UK government has vowed to increase total R&D expenditure to 2.4% of GDP by 2027. With this ambitious target in sight, now seems a good time to pause and reflect on where to focus investment to support the fourth industrial revolution and big data research. Funding for research and development (R&D) is increasing around the world as governments, companies and academic institutions strive to remain competitive in a world increasingly shaped by technology. Already ranked in the top ten countries for R&D spend, the UK has pledged to step up, setting an ambitious target to become the most innovative country in the world and to increase its total R&D expenditure to 2.4% of gross domestic product (GDP) by 2027. Infrastructure first: for the UK to be at the forefront of data-oriented research, the sector must have a highly robust and secure network powered by industry-leading technology that can scale to support bandwidth-intensive research like genome sequencing and the Square Kilometre Array. As one of the busiest, fastest and most secure private networks of its type in the world, the Jisc-run Janet Network provides the UK’s research and higher education sectors with very high-speed, reliable connectivity and in-built cyber security. To attract international investment in UK R&D, talented researchers need to be linked with industry partners and the right infrastructure. 
For example, the development of 5G testbeds will enable partnerships between researchers and the private sector, such as a 5G and internet of things project in Worcester involving Malvern Hills Science Park, the Local Enterprise Partnership, Worcestershire County Council, Mazak, Bosch and QinetiQ. The importance of investing in research infrastructure, as well as developing digital skills, is also highlighted by the Campaign for Science and Engineering (CaSE) in its latest report, 'Building on scientific strength: the next decade of R&D investment'. Jisc welcomes its recommendation that the government should front-load investment in these areas to ensure UK R&D capability can sustain growth. A sharing economy through connections: Jisc continues to work to connect science parks to the Janet Network as a means of fostering such useful relationships between research and business. The government, through Jisc, is committed to continuing the strategic development of the Janet Network – in the context of the forthcoming UKRI research and innovation infrastructure roadmap – alongside services supporting research data management and further enhancements to security and access control. Developing the economy to be more data-driven also has the potential to reduce the gap between the least and most productive companies within any sector. This divide is a growing problem for the UK and puts the brakes on our national productivity. However, a dynamic data-driven economy is only achievable if foundational datasets are open or available with very low barriers, within legal and ethical constraints. There needs to be a step change in the availability of open data and of data scientists; otherwise, the benefits will accrue only to big businesses able to access these resources. 
Universities are already exploring greater collaboration by sharing their state-of-the-art equipment with each other, the research community and industry – from anechoic chambers to wind tunnels, mass spectrometers to supercomputers. Skills to understand large datasets: the sector is also increasingly focused on open science and open data wherever possible, and on training highly skilled data analysts, software and modelling experts. For example, analytics labs run several times a year, bringing together data analysts to work on key topics and challenges in the HE sector and provide data-driven solutions. In a small way, this work will contribute to developing the talent the UK needs. According to the Confederation of British Industry (CBI), only 4% of companies report having the right people, tools and data to be able to draw meaningful insights from data. At the moment, postdocs from the hard sciences are the most likely to have the skills and knowledge to work with large datasets. The well-documented loss of those postdocs from the academic sector is a problem, even if it benefits the wider economy. Science increasingly depends on the skills not only of frontline scientists but also of other roles, such as those supporting complex research infrastructure. Some argue that the talent lost from academic research is partly due to excessive burden, affecting researchers’ wellbeing. Support from the private sector: another point that needs to be addressed with regard to the increased R&D investment is that the funding needs to be shared between the public and private sectors. 
Indeed, the lesson from other countries is that it’s not possible to achieve a rapid increase in the percentage of GDP going into R&D unless the lion’s share of the contribution comes from the private sector. This concern is also voiced by the CBI in its recommendation to focus on tax credits, an effective way to leverage funding with help from the private sector. There are other public-spend measures that ‘crowd in’ private R&D funds; one is the Higher Education Innovation Fund scheme, which backs the development of a broad range of knowledge-based interactions between universities and the wider world. Open science, too, is part of this picture – for example, the world-leading work of the Structural Genomics Consortium. As diverse public-private research collaborations emerge to meet the aspirations of the UK industrial strategy, demand for relevant skills, secure infrastructure and a healthier research culture to support them will surely grow. Any significant policy initiatives affecting the way university-based R&D is funded should take these burdens into account, especially in an environment where the outcomes of the Augar review may affect university finances negatively by cutting tuition fees. And if universities become increasingly driven by commercial demand, the Haldane principle – the idea that decisions on how to spend research funds should be made by researchers rather than politicians – will come under considerable strain, jeopardising the UK’s undoubted success in research. 
  • Is your college ready for the digital revolution?
    Last month, a report for the Further Education Trust for Leadership (FETL) assessed digital skills in the further education (FE) sector and found them wanting. Leaders must address the strategic and operational impact of technology, the report said. It’s a timely warning – especially in light of the government's edtech strategy, which places digital innovation at the heart of education policy. But how can colleges drive change while preserving their identity and delivering the skills the UK so urgently needs, both affordably and effectively? The sector is crying out for a bold, clear plan to drive such change, enabling colleges to look to the future. There’s a real need for simplicity and flexibility in the FE system, rather than the current complexity. We need something that everyone understands – from employers to learners and parents. We need clarity over what FE skills training does and the possibilities it affords employers. And we need to offer clear advice and guidance to help learners make the right choices. "All talk, no action" is a criticism I’ve heard a lot lately. Across all regions of the UK, there’s a perception that policy-makers, experts and influencers are doing a lot of head-scratching that amounts to little change – and certainly doesn’t bring additional funding. A quick glance at the list of commissions and inquiries currently gathering evidence seems to support these concerns. 
No wonder there’s a high level of frustration with the existing system. It will be a challenge for colleges to adapt, but as head of further education and skills at Jisc, I help institutions identify how they can innovate to meet the specific needs of their learners and community. We share our experiences and highlight technology that is already working for other providers and in other industries. That offers reassurance, so FE leaders can innovate without taking drastic risks. It’s about recognising when technology has a role to play. For example, learning to spray-paint a car or visiting an oil rig can be expensive. Simulating that experience in a college environment using virtual reality technology is a safer, easier and more cost-efficient approach, with a lower carbon footprint. Technology-enhanced training is increasingly necessary as the fourth industrial revolution – the digital revolution – disrupts the workplace. Demand for FE will increase as lifelong learning becomes the norm and greater numbers of people look to upskill and reskill throughout their careers. Some jobs will change too, due to advances in AI and automation, adding further demand for FE and skills training. Colleges mustn’t fear this brave new world. While the pace of change is fast, community and place remain central to the FE space. 
Bricks and mortar are no less important today than they’ve ever been, despite the rise in distance learning and online programmes. Blended learning, allowing colleges to deliver flexible courses that combine on-site and off-site study, is gaining traction – and having somewhere to study where learners come together with other people is crucial. Providers must identify their technology needs in this context and start a cycle of innovation and investment. The colleges of the future must be digitally fit for purpose. Paul McKean is a member of the steering group for an inquiry by the Skills Commission, Policy Connect and the Learning and Work Institute (funded by FETL) that will investigate the FE provider base, employer needs, and the implementation of national policy. He urges colleges to submit evidence to the inquiry, via Policy Connect, to help shape the future of the skills system. The deadline is 27 May 2019. Meanwhile, FE practitioners interested in developing their digital capabilities can attend Jisc’s Connect More events around the UK, 4 June – 4 July 2019.
  • Creating the library of the future
    Libraries and learning resources services have embraced digital practice over three decades. Lis Parcell reflects on their pioneering approach and considers how libraries will continue to reinvent themselves. When did academic libraries first embrace digital technology? The answer might be earlier than you think. Experiments first took place in the 1960s and 1970s, when librarians looked to harness ‘library automation’ to free staff from repetitive tasks and enable resource-sharing. The 1980s saw the expansion of library networks and microcomputers, while CD-ROMs put new searching power in the hands of researchers. For UK universities, the pivotal point was 1995, with the launch of eLib: the Electronic Libraries programme. This extensive programme had a major impact on the culture of libraries and their take-up of technology, and we can trace eLib’s pioneering, collaborative ethos right through to present-day innovations, such as a project currently under way to transform Jisc's library support services. Colleges with a vision: by 2000, further education libraries – often termed learning resource centres, or LRCs – were embracing digital innovation. It was a key moment for me, too: after seven years’ immersion in digital library innovation at Swansea University, I took a role in Wales with one of 13 FE regional support centres. These were set up across the UK to help colleges – newly connected to the Janet Network – make the most of the internet for learning and teaching. 
College librarians were eager to seize opportunities for digital innovation, and I developed enormous respect for their adaptability. I admired their dedication to improving the lives of students, often juggling teaching and learning technology roles alongside more traditional learning resources management. A culture of innovation: I believe it is largely thanks to libraries’ culture of cooperation and innovation that they have defied predictions of their imminent demise, becoming centres of digital practice. They provide digital content and the systems to discover, access and share it. They facilitate information literacy and digital skills. And they use technology creatively to engage their communities. Today, digital library practice is shaping new learning spaces, enriching the student experience and making research possible. So why is the library’s role in digital capability so often underestimated? It could be to do with the outdated image of libraries in the popular imagination. I suspect it could also be because libraries often introduced digital improvements seamlessly, minimising disruption for students and staff. It’s an approach that puts people first, rather than placing technology at the forefront. Navigating AI, big data and ‘fake news’: as we approach the fourth industrial revolution, the role of library staff in digital practice and leadership won’t stand still. As well as navigating changes in digital content and scholarly communications, libraries will position themselves in relation to trends in artificial intelligence, the internet of things and big data – Education 4.0. In a world of ‘fake news’, librarians’ expertise in information and digital literacy is crucial. There are ongoing challenges to improve the user experience too, making digital resources more accessible and engaging. 
We also need to ensure organisations understand the vital role libraries can play in learning, teaching and research. Fit for the future: as a subject specialist, I support members through a mix of advice, consultancy and staff development. I do this as part of a small team focused on digital practice in learning and teaching, working with other subject specialists in areas such as strategy and infrastructure. I’m excited to be working with libraries and learning resources services as they continue to evolve, seeing our members work creatively – with and without technology – to get fit for the future. And because every member and every library is different, I know I’ll never stop learning. Lis Parcell is a Jisc subject specialist – digital practice (library and learning resources services). She is delivering a keynote at the Working Smarter in a Time of Change conference at Teesside University, 3-4 June. Look out for her at this year’s Connect More events too, 4 June – 4 July.
  • From atoms to bits – edtech strategy sets the stage for Education 4.0
    Earlier this month, the much-awaited education technology (edtech) strategy for England was launched with a speech by education secretary Damian Hinds at the Schools and Academies Show 2019. The focus of the edtech strategy is on how tech can be used to make a positive difference in schools, colleges and universities. What changes might be envisaged as the new strategy beds in? In this article, I’ll try to map a path between the direction of travel set in the edtech strategy and the vision of Education 4.0 that Jisc is developing with members. As Wonkhe's David Kernohan notes, it’s 14 years since the previous edtech strategy, and a lot has changed in that time. Not only has the internet become pervasive and all-encompassing, but the bulky beige boxes we used to use to get on to it have shrunk into svelte slabs of glass and aluminium. The importance of literacy and numeracy: apps and websites have proliferated to fill just about every need, instantly and on demand. But education isn’t like that. Tight budgets can mean that equipment is used until it breaks, and, let’s face it, nobody wants their kids to be experimental test subjects. And until the skill pill is invented, we have to face the fact that it takes time to learn stuff – and everyone learns in their own way, at their own pace. Nowhere does this matter more than in literacy and numeracy, where, in spite of numerous well-intentioned policy interventions, the UK still struggles to get a third of its 11-year-olds up to speed. 
And for some learners the outcomes are far worse – at a school near me, two thirds of pupils fail to reach the expected standard at Key Stage 2. At the same time, job roles that might once have been regarded as wholly vocational have increasingly been academicised, with degree-level entry requirements now the norm in areas like nursing and midwifery. As I said in my evidence to the Education Select Committee inquiry into the Fourth Industrial Revolution in January, there is a crisis brewing here: literacy and numeracy underpin the digital skills required by the near-future industries that will be the backbone of the UK’s economy in future decades. And those ‘expected’ levels of literacy and numeracy are also becoming essential for everyday life, as the high street and public services alike move from the world of atoms to the world of bits – from bricks to clicks. So, while digital skills are seen mainly as an opportunity for people to find their way into new careers and even new industries, it’s important to acknowledge that there are millions who risk being left behind – and need a leg up. The British Chambers of Commerce found that three quarters of UK businesses already had a shortage of staff with key digital skills like word processing and spreadsheet editing, and a survey by Barclays found that nearly half of UK adults lacked these core digital skills. Commoditised computing and pervasive connectivity have already transformed so many areas of our lives; when did you last plan a journey on a paper map, consult a printed timetable, or visit a travel agent to book a holiday? 
Education has been changing too, but much more cautiously. Ten grand challenges: the edtech strategy focusses on immediate practical steps, such as helping to build the evidence base for effective edtech and helping educators learn from their peers. At the same time, it sets out ten grand challenges on which edtech companies and educators are encouraged to collaborate, supported by a £10m innovation fund. These challenges include reducing teacher workload by at least two hours a week and showing how technology can facilitate flexible and part-time working. We’ve heard from Jisc members that Industry 4.0 technologies like artificial intelligence (AI) and augmented reality (AR) have huge potential in education, with AI especially starting to be used in everything from admissions to assessments. Looking ahead, perhaps the key question for our policy makers and institutional leaders is really about risk appetite. Do we take cautious incremental steps or bold leaps into the future? Read all about our Education 4.0 vision and how members can get involved.
  • Opening up immersive technologies to education
    Using immersive technology, student nurses can perfect their stitches and criminals see the consequences of their actions. In this post, I explore today’s practical applications of virtual and augmented reality (VR and AR) to see how they may benefit education.

Using new electroencephalogram (EEG) technology, which records brain activity, it’s possible to drive a car using your mind or make an object on a screen smaller just by thinking about it. There’s so much to be excited about with this and other emerging technologies, but in education it’s only really the champions – the early adopters – that are currently benefiting. It doesn’t have to be that way.

[#pullquote#]The EEG technology that’s used to do those mind-blowing things is easy to set up[#endpullquote#]

The EEG technology that’s used to do those mind-blowing things is easy to set up, and there’s no need for power, so it has myriad uses. A VR simulator, for example, could be used to train cabin crew on an aeroplane that’s flying through turbulence, while the EEG technology measures stress. Trainees might be very anxious the first time, but learn to manage their stress effectively by the fifth rotation. This gives a better understanding of what people are experiencing, which could improve training methods.

Healthy preoccupations

We've worked with AR at the University of Manchester Medical Centre, role-playing practical examinations. Medical students must show they’re competent at communicating effectively with patients, taking blood, administering antibiotics and stitching wounds. AR is used to give them an understanding of the equipment, reinforcing how to use it properly with limited supervision.

[#pullquote#]we’ve worked with AR at the University of Manchester Medical Centre, role-playing practical examinations[#endpullquote#]

Other organisations are doing great things in this space, too.
A problem in medical schools is that students have limited access to cadavers, so a company called Medical Holodeck is using simulated patients with real MRI and CT scans from anonymised patients to allow students to diagnose and recognise conditions virtually. Immersive technology is also being used to help humanise experiences. For example, nurses could be placed into a virtual situation to practise giving bad news to a patient - then the situation is flipped, so the nurse can embody the patient. Research shows that we empathise more if we can experience a situation such as homelessness through another's eyes. It’s the same for criminals: someone who’s been arrested for racist behaviour could use VR to place themselves in their victim’s position. This method has been shown to reduce the amount of racial bias, at least in the short term.

Learning from Pixar, Disney and Apple

In the futures team, we spend time talking about good practice across a multitude of areas. I met people at Pixar to see what we can learn from their processes, about the dynamics between teams and the technologies that they use to streamline their approach in filmmaking.

[Image: Matt Ramirez at Disney Pixar; background shows a scene from Disney Pixar's Finding Dory]

Disney does a lot of research in immersive technology too, looking to monetise its computer generated assets beyond films, in areas such as VR, allowing experiences to prevail across multiple viewing mediums. In addition, Disney has developed Project Cardinal, a tool to dynamically translate initial scripts to VR for draft pre-visualisations. And when Apple recently launched its AR toolkit (ARKit), I was able to talk to software engineers directly. I told them about Jisc, showed them what we’ve done and discussed how they might link with education.

[#pullquote#]I told [Apple] about Jisc, showed them what we’ve done and discussed how they might link with education[#endpullquote#]

AR and VR are often best as 20-minute supplementary experiences. They don’t usually work in isolation, but instead complement conventional teaching and learning approaches. These technologies are efficient in communicating subject matter that is traditionally theoretical and abstract. That’s where I think immersive experiences will take over in learning in future – especially when we can deliver them in any browser, device or wearable. That’s when massive waves of people will adopt AR and VR in education.

A pedagogic backbone

We’re currently putting together an AR/VR immersive technology service. This has been ad hoc before now, so institutions could collaborate with us, but there wasn’t a nailed-down offer. We’re now developing advice and guidance, pulling together a framework of companies to recommend to our members. Often, what we see in institutions is that the first project acts as a catalyst. For example, some of the VR work we did with Preston’s College spawned a number of other initiatives, which the college was able to scale up internally.
We showed what was possible with the technology at a low cost, and the college staff took it from there.

[#pullquote#]We’re looking to solve problems, not just show shiny new technology[#endpullquote#]

Now, we’re collaborating to develop immersive tech with a pedagogic backbone. We’re looking to solve problems, not just show shiny new technology – and now, as well as working with champions, we’re reaching out to other institutions that want to dip a toe in the water. It’s not about short-termism. We aim to look beyond the horizon and explore how future technologies can potentially disrupt the education space.

Find out more

If you are interested in learning more, you can book a place on our training course: introducing augmented and virtual reality in education, which takes place on 14 May at Weston College in Weston-super-Mare.
  • Why ethical debate is crucial in the classroom
    As digital technology transforms our world, computer scientists must consider the ethical impact of their work. In her powerful Digifest workshop, Miranda Mowbray illustrated why this is so important. Here, she shows how universities can keep up with the pace of change.

Life was different in the 1970s. In those days, computer scientists didn’t, for the most part, have to make difficult ethical choices as part of their jobs. They were in basements tending to machines, making sure their code worked well. Today, they’re in boardrooms making decisions that may affect democracy. Because of the rising power, influence and importance of computer systems – which are embedded into pretty much every aspect of modern life - they now have greater capacity for producing good and bad outcomes. Social media, for example, exerts a big influence on the way we work, play, interact, and on our politics. That’s exciting for computer science.

The world has changed

But it means that our teaching has to change. At Bristol University, I've been giving guest lectures on ethics for computer science. Students asked for this topic to be in the curriculum. The British Computer Society (BCS) has an interest in it too. In order to gain BCS accreditation, in fact, computer science degrees are now required to have content on legal, professional, social and ethical issues. But the method of teaching it is important. I have seen ethics courses that just say ‘do this, don’t do that’.

[#pullquote#]it’s important that students learn to apply ethical reasoning themselves[#endpullquote#]

In the workshop I ran at Jisc’s recent Digifest event, I explained that it’s important that students learn to apply ethical reasoning themselves, so that when they come up against a new ethical issue in their professional life, they can analyse it independently. Part of the problem is that this requires discussions, and computer science students don’t have a reputation for liking discussions.
Also, it’s really hard to build efficient computer systems that output the right results and don’t crash all the time. Learning to do that at university takes at least three years of work, so it’s understandable why traditional computer science degrees just teach the technical skills. But we need to develop ethical reasoning and communication skills too.

'All interesting ethical questions are dilemmas'

I start by telling students that there may not be a single correct answer to a discussion question. If anyone expresses a view which is unpopular, I say that’s fantastic; we have a disagreement. I did this at the Digifest workshop too, because even Jisc’s delegates – many of whom are lecturers and educationalists – may be reluctant to openly express an opinion that others may not share. I try to hold myself back from revealing my own opinions upfront too, so as not to intimidate people who think otherwise, and to allow an exploration of different ideas. All interesting ethical questions are dilemmas with arguments on both sides. If you disagree with someone, it really helps to see where they’re coming from in order to persuade them of your point of view; and if you can do that, you may even discover that your own opinion was wrong.

[#pullquote#]The ability to have respectful, rational discussion with people with whom you disagree is a highly transferable skill[#endpullquote#]

The ability to have respectful, rational discussion with people with whom you disagree is a highly transferable skill. It’s important for life as a citizen and as an employee. I draw on real-life examples in my class. I saw the video of Christopher Wylie talking about what it was like to be the research director for Cambridge Analytica, and how he now strongly regrets what he did. He thinks it was very unethical.
But he was under huge pressure to deliver at the time, and he was only 24! Computer science graduates can very quickly find themselves in a position where they’re affecting the quality of democracy, and they may have no training and no support in this ethical decision-making position. We talk about these dilemmas in my classes. For example, in some judicial systems, after a criminal has been convicted, in order to help a judge decide whether they need to be locked up or not, a machine learning algorithm is used to predict whether or not they are likely to reoffend. That machine prediction is more accurate, on average, than predictions made by humans in the justice system. Should we leave the decision entirely to the algorithm? Most people say no. Why?

Looking beyond the law

Ethical considerations need to go a lot further than just asking whether or not an action is legal. Some things are commonly considered unethical but aren’t actually illegal. Laws tend to say whether something is permitted, but ethical analysis can indicate which of two or more options would be better, and so can help improve systems that are already OK.

[#pullquote#]Ethical considerations need to go a lot further than just asking whether or not an action is legal.[#endpullquote#]

One of the three main ways ethical philosophers have suggested to tell whether an action is good is to see whether it conforms to rules. But there are two others: look at whether it’s in line with positive values; and consider the likely outcomes for stakeholders. I encourage students to use all three ways of looking at an ethical problem. Although there are arguments on both sides of interesting ethical questions, that doesn’t mean that it’s all relative and just a matter of opinion. Students need to be able to reach a decision on whether something is good or bad, taking into consideration the arguments on both sides.
They should talk about ethical questions with their peers, and with their colleagues when they’ve moved into industry. If there is a potential ethical issue in a company, it’s easier to address it as a group than as a single employee.Too often, computer scientists feel that discussing ethics isn’t part of their job. Well, it is – and it’s fascinating and important.
  • The perils of big data
    Speaking at Jisc’s forthcoming Networkshop 47 conference, Kieron O’Hara warns that even anonymised data can reveal sensitive personal information. We must ensure data is both safe and useful, he says.

Anonymisation is controversial. Even if a dataset is nicely anonymised, if someone comes along with extra information, they may well be able to work out who is who. For example, you may have an anonymised medical dataset, but you know that the prime minister has been in hospital over a certain period of time for some mysterious ailment. You just need to look out for a woman of Theresa May’s age and add in the common knowledge that she has diabetes. If diabetes is mentioned in the dataset you can, with a reasonable degree of probability, identify the prime minister's visits.

The data environment

Anonymisation is an ongoing process, because as more data gets published - on the web, for example - it may become easier to crack a dataset. The anonymisation that was perfectly safe two years ago may not be adequate now.

[#pullquote#]anonymisation that was perfectly safe two years ago may not be adequate now.[#endpullquote#]

The environment data sits in is crucial too. Who's got access to it? What other datasets might be relevant? What are the security and governance arrangements? Anonymisation is about manipulating these aspects, as well as taking out obvious identifiers such as addresses and names. The problem nowadays is that there is so much information around that almost anything could potentially be an identifier. With this in mind, you've got to be very, very careful as to how you manage personal data, how you anonymise it, and what context it sits in. We call this environment-sensitive method functional anonymisation. So I might say, for example, that you can see some data, but you have to come to my offices and analyse it on a standalone computer without access to the internet.
Or I might not let you see the data but say, if you give me some queries, I'll send the answers back. Our approach at UKAN isn’t about making data 100% safe, because that's just not possible. What we’re trying to do is reduce the risk of anything going wrong.

Assessing the risks

The trouble with GDPR is it tends to produce a box-ticking mentality. The focus is on compliance, not on reducing risk. The easy solution is to simply decide not to share your data with anybody. That's totally safe – or, at least, as safe as your security systems are. But then, there's a lot of value in the data that's not being exploited.

[#pullquote#]The focus is on compliance, not on reducing risk.[#endpullquote#]

So how do we balance utility against risk? At UKAN, we look at what we want to achieve and do with the data. Then we ask questions, such as what's the minimal amount of data I need? You start to tailor your requests and think about the risks. Suppose I share data and an intruder manages to get a copy. What could other people do with it? What would that cost them? Would it take an immense amount of processing power, or could they do it on a simple laptop? We’re thinking hard about the data and the potential impact of a hack.

Protecting people, not 'data'

Ethical and responsible data stewardship is about taking the risks seriously - not only to yourself, your company and the liabilities that your company might find itself with, but also to the people who are represented in the dataset themselves. They’re significant beings, not simply data points. Then there are your ethical responsibilities to wider society. If data can be used for social good, look for opportunities to share it in a way that will provide social gains.

[#pullquote#]If data can be used for social good, look for opportunities to share it in a way that will provide social gains.[#endpullquote#]

I would like those who are worried about GDPR to think more positively about the possibilities of sharing data.
Meanwhile, those people who have been taking a lot of risks with their data might want to think rather more seriously about it. I want to communicate the sense of an ethical framework that applies not only to data, but also to the people that the data is about.

[Image: Networkshop47 logo]

This blog is based on an interview from the Networkshop magazine 2019. Delegates can hear Kieron's presentation on data anonymisation at 15:30 on day one of Networkshop, on Tuesday 9 April, in Lecture theatre 2.
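The re-identification risk O'Hara describes can be illustrated with a toy linkage attack. This is a minimal sketch in Python, assuming an entirely fictional "anonymised" hospital dataset and an attacker who knows only a target's age and a publicly known condition:

```python
# Toy linkage attack: re-identifying a record in an "anonymised" dataset.
# All records below are fictional and for illustration only.

anonymised_records = [
    {"id": "p01", "age": 62, "condition": "diabetes", "admissions": ["2019-01-10"]},
    {"id": "p02", "age": 62, "condition": "asthma",   "admissions": ["2019-01-11"]},
    {"id": "p03", "age": 45, "condition": "diabetes", "admissions": ["2019-02-02"]},
]

def link(records, known_age, known_condition):
    """Return the records consistent with the attacker's auxiliary knowledge."""
    return [r for r in records
            if r["age"] == known_age and r["condition"] == known_condition]

# The attacker knows the target is 62 and has diabetes (public knowledge).
matches = link(anonymised_records, 62, "diabetes")
# If exactly one record matches, the "anonymous" record is re-identified,
# exposing the target's admission dates.
print(len(matches), matches[0]["admissions"])
```

With only two quasi-identifiers the candidate set collapses to a single record, which is why functional anonymisation considers the whole data environment, not just the removal of names and addresses.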
  • What can we learn from the Myspace data loss?
    A major data loss by file sharing platform Myspace is a timely reminder about trust and the permanence of online content platforms.

Last week, Myspace publicly admitted to a huge data loss. It told users: “As a result of a server migration project, any photos, videos, and audio files you uploaded more than three years ago may no longer be available on or from Myspace. We apologise for the inconvenience.” That amounts to a loss of 13 years of user generated content, estimated at more than 50 million tracks. The loss of Myspace data calls into question the notion of trust in open content sharing platforms, but data loss is not the only issue. Changing business decisions such as those seen with Flickr and Google+ remind us that what appears to be permanent may not always be so.

When is forever not forever?

Initially a slow-burning story highlighted by Reddit, the news that Myspace has lost its data pre-2015 is a wake-up call on the reliance and resilience of content sharing platforms. Founded in 2003, Myspace quickly grew as a platform for emerging artists and musicians. Bands such as Arctic Monkeys embraced Myspace to promote their music and grow a fan base before they were signed to a record label. All photos and events on their Myspace page now appear to have disappeared and music files no longer stream.

[#pullquote#]this huge loss of data, of cultural heritage, highlights the shifting sands that can underpin such platforms.[#endpullquote#]

Whatever the current user base in comparison to the early years of the platform, this huge loss of data, of cultural heritage, highlights the shifting sands that can underpin such platforms.

Imposing restrictions

Myspace users are not the only ones affected.
Flickr announced in November 2018 that it was removing the 1TB storage allowance per user on its free accounts, limiting users to 1,000 images. Cultural, government and non-profit institutions using Flickr Commons were exempt from these decisions; however, not all libraries and special collections use Flickr Commons. Following lobbying from Creative Commons, SmugMug (the owners of Flickr) belatedly revised their position on free accounts. In early March 2019 they came to the welcome decision that all freely licensed images, including Creative Commons and public domain works, would be exempt from upload limits. Those users choosing other licences such as ‘all rights reserved’ would continue to be subject to the 1,000-image limit.

Another tech giant that has had to warn users about the potential loss of data is Google, which recently announced the closure of its social media platform Google+ for consumers as of 2 April 2019. It recently emailed users instructions to delete their accounts and an FAQ detailing what would and wouldn’t be saved.

[#pullquote#]It highlights the loss of control over our content when we place it on social media platforms.[#endpullquote#]

These are just a few notable examples of social media sites changing during their lifetimes, with a real impact on users at both a personal and institutional level. It highlights the loss of control over our content when we place it on social media platforms.

Choosing the channels to promote your digital materials

Should we then stop using social media platforms to promote digital materials for learning, teaching and research? The answer is no, but there are lessons to be learned:

1. Know your audience

Knowing your audience is key to deciding where you place content online. Is Flickr Commons, Wikimedia, Twitter, etc. where your audience really is? Do these platforms support your institutional mission?

2. Know your rights

Central to this is an understanding of the terms and conditions of those platforms at a data-in and data-out point.
Are you giving up rights to content by posting it? What licence is suitable for your content? What recourse do you have if you want to remove content?

3. Be conscious that things can change

All of this is part of the risk assessment at the beginning of the process to post materials on any platform, and with it should be an underlying acceptance that things may change in the future. While this may deter some from engaging with social media platforms, posting content online enriches learning and teaching opportunities for all.

4. Keep up to date with changes to platforms

As far as longevity of content on social media platforms is concerned, how can you keep abreast of changes to platforms to ensure you can protect your content over the years? There’s no quick way to do this. Companies often keep quiet until they have no choice but to go public, such as the data breach that led to the demise of Google+. The Myspace story has shown that full disclosure of its data loss came only over a long period of time, often through individuals asking questions as to where their content had gone before the bigger picture emerged. It would be impossible to monitor content daily, so what can we do? My suggestions would be to:

- Periodically spot-search and check the functionality of the content on the platform
- Directly query platforms if you spot issues
- Check news updates on the social media platforms themselves
- Back up your content and data where possible

You can find more advice on platform choice and copyright considerations in our guide, making your digital collections easier to discover, and our accompanying training course.

During the past 15 years social media platforms have become a ubiquitous part of our culture, opening access to content and myriad ways to engage with that content and with each other. We shouldn’t lose our trust in social media platforms; however, we need to acknowledge their potential transient nature and treat them with the appropriate watchfulness.
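The periodic spot-checking suggested above can be partly automated. Here is a minimal Python sketch; the URLs are hypothetical and the `fetch_status` stub stands in for a real HTTP client (in practice you might fetch actual status codes with `urllib.request`):

```python
# Spot-check that content posted to third-party platforms is still reachable.
# `fetch_status` is injected so the check can be exercised offline; a real
# run would perform an HTTP request and return the response status code.

def find_missing(urls, fetch_status):
    """Return the URLs whose status code suggests the content is gone."""
    return [u for u in urls if fetch_status(u) >= 400]

# Hypothetical catalogue of content an institution has posted elsewhere.
catalogue = [
    "https://example.org/collection/photo-1",
    "https://example.org/collection/photo-2",
]

# Stub standing in for a real HTTP client.
def fake_status(url):
    return 404 if url.endswith("photo-2") else 200

missing = find_missing(catalogue, fake_status)
print(missing)  # flag these for follow-up with the platform
```

Running such a check on a schedule, against a catalogue kept alongside your local backups, turns "periodically spot-search" into a routine task rather than an occasional discovery.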
  • Making AI ‘the best thing ever to happen to humanity’
    How can higher education institutions (HEIs) best embrace technology to benefit staff and students? A theme that emerged at Digifest 2019 was the need for humans and technology to support one another.

Technology is crucial to the future of education, industry and society - but it’s nothing without humans. This theme came up again and again at our Digifest event last week, whether speakers were discussing skills the tertiary education sector should be nurturing, or highlighting issues facing students and educators and asking how HEIs may work to resolve them. I felt our keynote speakers were particularly strong this year. Joysy John, director of education at Nesta, spoke passionately about using technology to offer a broader, fairer and smarter education system. This is about developing human skills, such as communication and problem-solving, while using technology and data to make education more accessible. Human growth is at the heart.

Using technology to support staff and students

A key piece of research, Jisc’s Horizons report, echoes this message. Compiled by the Horizons group – comprising representatives from 30 institutions (HEIs, FEIs and national bodies) together with Jisc – this report outlines a number of strategic challenges facing UK universities and colleges, from finance to cyber security, then focuses on the escalating mental health challenge in education. Technology is already playing a role in supporting student and staff wellbeing, with learning analytics increasingly being used to identify students at risk and enable early intervention. In her Digifest presentation, Dr Dominique Thompson – a former campus GP who is now director of Buzz Consultancy for student wellness – addressed possible causes and potential solutions for the increase in mental health issues within HEIs.
Concerns such as finances and employability, she believes, are heightened in today’s hyper-competitive, digitised world.

[#pullquote#]What message do we send young people when they can have pizza delivered to their door in minutes but have to wait six weeks or more for mental health support?[#endpullquote#]

What message do we send young people when they can buy shoes online at 3am or have pizza delivered to their door in minutes but have to wait six weeks or more for mental health support? Dominique stressed the importance of humans and technology coming together to support leaders and students in their mental health. At Nottingham Trent University, for example, a dashboard generates an alert if a student doesn’t engage for 14 consecutive days, allowing tutors to follow up. Online services and apps are also supporting students’ wellbeing, and chatbots are now entering this space too. Bolton College’s chatbot, Ada, for example, responds to students’ wellbeing concerns with links to appropriate online information and contact details for the college’s student support teams.

[#pullquote#]meaningful support comes from human beings, and we believe this will be the model in future.[#endpullquote#]

In these examples, meaningful support comes from human beings, and we believe this will be the model in future. While it’s important and valuable to recognise that technology, such as learning analytics, can help to identify wellbeing and mental health issues early on, it must be used wisely to help the people at colleges and universities understand the problems their students and staff struggle with, and offer timely support. Collaboration will also be crucial.
Of course, it will have to be done in a legal and ethical way, but one can imagine the benefits of a world where data was shared between schools, colleges and universities, and across services from healthcare to accommodation, so that key information about a student follows them throughout their education journey. This could highlight potential areas for concern or awareness as young people enter FE or HE, alerting their new institution to support that may be needed.

We need to be positive about technology

Another theme of the Digifest presentations, panel debates and workshops was the need for humans to welcome technology with positivity and optimism. All too often, said Dave Coplin, CEO of The Envisioners, in his keynote presentation, people perceive new technology as a threat – especially AI and robots. We at Jisc believe the emerging technologies of the fourth industrial revolution, applied to the academic world, will lead to the new paradigm we are calling Education 4.0. This is about doing some things in a completely new way – for example, introducing new immersive learning activities via augmented and virtual reality or via gaming, that could not be experienced in any other way.

[#pullquote#]Education 4.0 technology frees up educators’ time to focus on areas where the human interaction will always be key[#endpullquote#]

Another key feature will be highly personalised courses and curricula, tuned to aptitudes, aspirations and career paths and giving due weight to the wellbeing of the learner. New ‘on the fly’ assessment, based transparently on data, will avoid plagiarism concerns and stressful, high-stakes assessment activities.
And, arguably most importantly, Education 4.0 technology frees up educators’ time to focus on areas where human interaction will always be key: problem-solving, creativity, emotional intelligence, and motivation. Stephen Hawking warned that powerful new AI will be “either the best, or the worst thing, ever to happen to humanity.” Through Education 4.0, Jisc will help ensure that it is the former of those scenarios that prevails.
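The engagement alert described at Nottingham Trent (a flag when a student records no activity for 14 consecutive days) can be sketched as a simple gap check. The data, function names and threshold here are illustrative, not the university's actual implementation:

```python
from datetime import date

ALERT_THRESHOLD_DAYS = 14  # illustrative threshold, per the rule described above

def days_since_last_engagement(engagement_dates, today):
    """Days elapsed since the most recent recorded engagement, or None if none."""
    if not engagement_dates:
        return None  # student has never engaged; handle as a separate case
    return (today - max(engagement_dates)).days

def needs_alert(engagement_dates, today, threshold=ALERT_THRESHOLD_DAYS):
    """True if the gap since the last engagement meets or exceeds the threshold."""
    gap = days_since_last_engagement(engagement_dates, today)
    return gap is not None and gap >= threshold

# Hypothetical activity log: library swipes, VLE logins, and so on.
log = [date(2019, 3, 1), date(2019, 3, 3)]
print(needs_alert(log, today=date(2019, 3, 20)))  # 17-day gap -> True
```

The point of the article stands in the code: the rule only raises a flag; the follow-up that matters still comes from a tutor.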
  • Digital leadership report highlights progress in universities
    In the last few years, the education sector has been evolving at pace to take advantage of new technologies to help save costs, meet rising student expectations, and compete with online learning institutions, and rightly so. As technology advances at an exponential rate and becomes more and more ingrained in people’s day-to-day lives, it will continue to be a key asset that higher education institutions (HEIs) must make the most of to help with these and other goals. HEIs also have a responsibility to prepare students for the changing job market as we enter the fourth industrial revolution, or Industry 4.0. In our sector, we call this Education 4.0. Enabling student contact with new and emerging technologies as part of learning will help equip them for the workplace and teach them how to adapt to the next wave of digital innovation.

[#pullquote#]Enabling student contact with new and emerging technologies as part of learning will help equip them for the workplace[#endpullquote#]

But what technologies do universities see as the most important for their organisations, and how are they using these technologies to support organisational strategies? Developed with UCISA, Jisc's new report, digital leadership in HE: improving student experience and optimising service delivery (pdf), addresses these questions, using a survey of 50 UK university leaders and interviews and focus groups with 25 HEIs.

What’s top of wish lists?

Our research finds that 68% of respondents identified online learning tools as the most important technology, with artificial intelligence (AI) and machine learning in second place (32%).
I believe all the technologies identified could be used to improve experiences and provide greater accessibility for a wider range of students.

[#pullquote#]68% of respondents identified online learning tools as the most important technology[#endpullquote#]

Indeed, in the report, John Beaver, director of IT services at Bath Spa University, spoke about how AI could help a student with research and module/course selection: "For us, AI is a big interest, both as a technology that we may apply for student experience purposes, such as an AI that might find books of interest in the library for a student knowing what they're doing, or recommend particular modules or courses that may be of interest to them."

Progress on digital strategies

It’s heartening to see HEIs throughout the UK increasingly making progress with digital strategies; the research shows that more than half (53%) have a digital strategy in place, while a further 21% have integrated a digital strategy with other strategies.

[#pullquote#]it is important to take a whole-campus approach to a digital strategy before implementing new tools[#endpullquote#]

Innovation must have a purpose, and it is important to take a whole-campus approach to a digital strategy before implementing new tools; technology initiatives work much better when aligned with an organisation’s business and teaching and learning plans. Digital leaders in HE must now work out how ‘disruptive’ technologies can be introduced into methods of working in a way that encourages engagement from academic and support staff.

Thanks to all who were involved in the research for this report. I hope that it encourages university leaders to keep engaging with technology, but also to remember that it’s only a means to an end, not the end itself.

Download the report (pdf)