Is Tablet Technology the future in Supporting Communication?

Lena Negal, a recent CCCU BSc Psychology graduate, used her final year project to investigate the effectiveness of new technological advances over traditional paper-based methods in supporting communication with young individuals with ASD. Here, Lena shares the results of her study. The participant is referred to throughout as ‘Felix’.


The final year BSc Psychology students at Canterbury Christ Church University have worked hard this year and shown that their research interests cover a wide spectrum of fields. One of these final year projects was my case study investigating the use of tablet technology to support communication in a low-functioning adolescent with autism.

As technology advances, so do our means of communicating as a society. This does not exclude people with autism spectrum disorder (ASD). ASD affects a rising number of people in the United Kingdom and internationally; according to the National Autistic Society (2015), there are currently about 700,000 people in England diagnosed with ASD. As a result of this rising number of children diagnosed with ASD, parents, teachers and governments are desperately looking for helpful support systems to ensure that this growing population of people with ASD can develop and live to the best of their abilities. To achieve this, psychologists are looking at traditional methods already employed and attempting to give these methods a technological rebirth.

When diagnosing children with ASD, the areas mainly assessed are behaviour, communication and social skills. Speech, which can be argued to be the basis of these three areas, is limited or non-functional in 50% of the ASD population (described as low-functioning) (Boesch, Wendt, Subramanian, & Hsu, 2013). To support communication, augmentative and alternative communication (AAC) systems are often utilised. AACs include sign language, modified or simplified versions of sign language, speech-generating devices, and communication systems based on pictures or other visuals. One of the most commonly used AAC systems is the Picture Exchange Communication System (PECS) (Frost, 2002).


Image 1: Photo of Felix’s PECS book (original name and photo of Felix are covered)


Image 2: Photo of Felix’s PECS app (original name of Felix was covered)


The newest app created by the PECS team is called PECS® IV+. It was created as a digital version of a full PECS book; however, the creators of PECS underline that the app is a next step in widening communication after a person has mastered the PECS book, and is not intended to replace the book (PECS IV+ App: Interview with Dr. Bondy & Frost, 2014).

To examine the PECS creators’ prediction, I introduced fifteen-year-old Felix, who had previously used the PECS book as his AAC, to the PECS app. To ensure that communication outcomes were representative of Felix’s use of the PECS devices, the research was conducted with three communication partners in their natural environments. This case study used a 2×3 design, reviewing the relationship between the PECS device (PECS book and PECS app) and the communication partner (mother, therapist and teacher). Video observations of the six sessions were coded with INTERACT 14 and analysed in SPSS using a chi-square test.
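For readers curious about the mechanics of a 2×3 chi-square analysis like this one, the sketch below runs the test on a device-by-partner contingency table of coded communicative acts. The counts are invented for illustration only; they are not Felix’s data, and the study itself used SPSS rather than Python.

```python
# Illustrative 2x3 chi-square on coded communication counts
# (PECS device x communication partner). All counts below are
# hypothetical, not the study's actual data.

observed = [
    [34, 28, 22],  # PECS book sessions (mother, therapist, teacher)
    [26, 31, 30],  # PECS app sessions
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

# Expected count for each cell under independence is
# (row total * column total) / grand total; the statistic sums
# (O - E)^2 / E over all six cells.
chi2 = sum(
    (observed[i][j] - row_totals[i] * col_totals[j] / grand) ** 2
    / (row_totals[i] * col_totals[j] / grand)
    for i in range(2) for j in range(3)
)
df = (2 - 1) * (3 - 1)  # (rows - 1) * (columns - 1)
print(f"chi2 = {chi2:.2f} on {df} degrees of freedom")
```

The statistic would then be compared against the chi-square distribution with 2 degrees of freedom (e.g. via SPSS or `scipy.stats.chi2_contingency`) to obtain a p-value.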

The results showed that Felix used significantly more words in the sessions with the PECS book, while the PECS app increased his use of sounds (which he also uses to communicate). This might indicate that, with the PECS app, he tried to communicate using sounds for meanings he does not have words for. The three communication partners also differed significantly in their results; future research should therefore include differences between communication partners to ensure that the outcomes of introducing a new AAC device are relevant to real life.

Looking ahead, future studies should increase the training of communication partners and code their data to gain a deeper understanding of the differences in interaction. That Felix’s family now uses the PECS app at home, and that the school is interested in purchasing tablets to introduce the app to several children with needs similar to Felix’s, shows that they have seen positive outcomes for Felix. The use of tablets as communication devices for people with ASD could bring benefits such as increased vocabulary, employability and social skills. These outcomes should be reviewed in larger-scale research that also examines longitudinal effects.

References

Boesch, M. C., Wendt, O., Subramanian, A., & Hsu, N. (2013). Comparative efficacy of the Picture Exchange Communication System (PECS) versus a speech-generating device: Effects on requesting skills. Research in Autism Spectrum Disorders, 7(3), 480-493.

Frost, L. (2002). The picture exchange communication system. SIG 1 Perspectives on Language Learning and Education, 9(2), 13-16.

National Autistic Society (2015). How does autism affect children, adults and their families? Retrieved April 18, 2016, from http://www.autism.org.uk/About/What-is/Myths-facts-stats.

PECS IV+ App: Interview with Dr. Andy Bondy & Lori Frost (2014). Retrieved April 18, 2016, from https://www.youtube.com/watch?v=1v0kmDMwpVI.


Technology and Media in Children’s Development

Suzanne Bartholomew, PhD student in Developmental Psychology, shares her view of the Technology and Media in Children’s Development Conference, organised by the Society for Research in Child Development, California, USA.

Using tablets

Technology and Media in Children’s Development
October 27-30, 2016, Irvine, California

So here I am at my desk, fresh (fighting jet lag but feeling like a kid at Christmas) and back from Irvine, California no less. I had been attending a four-day special topic meeting organised by the Society for Research in Child Development (SRCD). Since 1933, this US-based society has encouraged multi-disciplinary research on child development and supported the application of such findings to benefit both children and families (Hagen, 2000). In the introductory speech, organiser Stephanie Reich declared the conference the SRCD’s first special meeting “SELL OUT”, with 100% of attendance capacity booked. Not surprising to me, as the meeting was titled ‘Technology and media in children’s development’. Could there be a more relevant topic for conference discussions? (Maybe I carry a slight bias for the topic, as a huge section of my research concerns child and parent use of media.)

The timing of this conference could not have been more perfect, as the American Academy of Pediatrics (AAP) has recently adapted its recommendations to meet the changing face of media use by younger consumers. Historically, the AAP has stressed the negative aspects of screen time, relying on empirical evidence implying possible interruptions to a child’s typical developmental trajectory. The AAP’s previous recommendations included a blanket ban on all media device use for children under the age of two. As we know today, this seems a pretty much impossible guideline for any parent.


With electronic device use, most commonly now the mobile kind, being an integral part of people’s lives, it is no surprise that the UK government (2014) reports that 87% of UK adults (44.6 million) use the internet, up 3.5 million since the 2011 report (www.ons.gov.uk). Advances in touchscreen technology have enabled younger children to become consumers of this digital world. The newest AAP guidelines acknowledge these changes, scrapping the blanket ban and, rather than setting time limits, recommending that online media use by children up to five years old should be as supervised by parents as the child’s non-media time (Radesky, Schumacher, & Zuckerman, 2015). Read more details of these changes on the AAP website:


https://www.aap.org/en-us/advocacy-and-policy/aap-health-initiatives/pages/media-and-children.aspx

My mission (if I chose to accept it, which I so obviously did; who in their right mind wouldn’t?) was to fly out to California and give a two-hour poster presentation titled ‘Digital effects of touch screen technology on focused attention and executive function’ (EF). The poster incorporated two studies. The first was research carried out by Dr Amanda Carr in the Canterbury Christ Church observation labs, examining attention in a sample of 18 children aged 10-36 months. Attention levels were recorded before and after both free play and touchscreen tablet play. Although there was an overall drop in attention in both conditions, there was no significant difference in attention after tablet play compared with free play. This suggests that playing on a touchscreen tablet may not have as negative an association with attention span as commonly reported.

The second project was my own work, looking at the association of screen time with EF and classroom behaviour in an older sample of 8- to 10-year-olds (N = 276). Correlations hinted at a marginal negative association between screen time and EF skills; however, a larger positive effect was found between screen time and classroom behaviour. We concluded that, for our samples, screen time had no immediate negative effect on very young children’s focused attention but that, as age and screen time increased, the effects on EF and classroom behaviour became more pronounced. Both studies have follow-up research being carried out as you read.
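As a rough illustration of the kind of correlational analysis described above, the sketch below computes a Pearson correlation between screen time and an executive-function score. Every value is invented for the sketch; the real study’s data, measures and effect sizes are not reproduced here.

```python
# Illustrative Pearson correlation between weekly screen time (hours)
# and an executive-function (EF) score. All values are hypothetical,
# invented purely to demonstrate the calculation.
import math

screen_time = [5, 8, 12, 15, 20, 25, 30, 35]
ef_score = [88, 90, 85, 80, 78, 74, 70, 72]

n = len(screen_time)
mean_x = sum(screen_time) / n
mean_y = sum(ef_score) / n

# Pearson r = covariance / (sd_x * sd_y), computed from deviations.
cov = sum((x - mean_x) * (y - mean_y)
          for x, y in zip(screen_time, ef_score))
sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in screen_time))
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ef_score))

r = cov / (sd_x * sd_y)
print(f"r = {r:.2f}")  # a negative r: more screen time, lower EF score
```

With made-up data like this the correlation comes out strongly negative; the study itself reported only a marginal negative association, which is why sample size and significance testing matter before drawing conclusions.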

Presenting both studies to an audience of highly knowledgeable and respected academics was an intimidating prospect that I tried to approach logically rather than emotionally. I was unsure which of the two projects I was more scared of explaining: Dr Carr’s research, which I was under pressure to explain 100% correctly (I couldn’t get anything wrong!), or my own, which, surrounded by all these highly educated people, I felt was possibly all wrong anyway. But that was to take place on the Saturday evening, and I had two days of conference before then.

I spent those two days sprinting from one set of talks to another. My conference diary of ‘things to see’ was full from 8.30am until 8pm. The conference offered many different types of talk: keynote speeches, hour-long lectures, symposia (ninety-minute sessions in which three or four researchers gave twenty-minute talks followed by Q and A), and flash talks (hour-long sessions of five or six talks in rapid succession with limited Q and A), all offering and delivering valuable insights into contemporary media use research. There is much more I could share here, but for reasons of space I shall concentrate on the keynote speeches.

Opening the conference proceedings were the principal keynote speakers, renowned psychologists Kathy Hirsh-Pasek and Roberta Michnick Golinkoff, with their lecture “Putting the ‘Education’ back in Educational Apps” (Hirsh-Pasek, Zosh, Golinkoff, Gray, Robb, & Kaufman, 2015). The room was full to bursting as issues in the development and construction of apps designed for young consumers were raised. The audience (me included) was held captive as the speakers explained how many apps are being marketed as not just entertaining but also educational, and that such claims are being made without much consideration of the known psychological processes of child learning and development. They offered us the four pillars of learning: a comprehensive framework designed to enable app designers and app-buying parents alike to deliberate on an app’s associated learning outcomes.

Image: guided exploration towards a learning goal

(Hirsh-Pasek, Zosh, Golinkoff, Gray, Robb, & Kaufman, 2015).

Summing up the vast amount of information given, the four pillars represent the factors required for learning to take place: learning must be active, engaging, meaningful and socially interactive. The added necessity is that all four pillars be embedded within the context of guided exploration towards a learning goal. This framework can be applied to almost any contextual learning situation; however, as a critical element concerns the digital world, the researchers gave examples of supporting evidence. An example for the social interaction pillar was teaching a child via three different social conditions: a live video chat between a teacher and the child; the child watching a recording of another child having a live video chat with the teacher; and the child watching a pre-recorded lesson from the teacher. Hirsh-Pasek and Golinkoff reported equally significant learning gains in the first two conditions, while no significant learning gains were made in the third. The other three pillars were illustrated with research examples and, much to the audience’s delight, the lecture was interspersed with amusing clips of educational apps and not-so-educational apps.

A full PDF of this lecture can be found at:

http://kathyhirshpasek.com/

To do a bit of name dropping, further to the fantastic Hirsh-Pasek and Golinkoff session, the conference itinerary of keynotes included:

  • Justine Cassell: Addressing the influence of peers and how this intrinsically dyadic interpersonal closeness could be replicated by a child with a virtual friend.
  • Patricia Marks Greenfield: Applying the theory of social change in a world where digital technology is a key aspect of our culture.
  • Ellen Wartella: Addressing public policies and asking how the media can be responsible for childhood issues such as obesity.

I enjoyed all of the above sessions, but as an admirer (dare I say fan?) I was most looking forward to attending the Sonia Livingstone keynote speech. Professor Livingstone was introduced as an Officer of the Order of the British Empire, having been awarded the OBE in 2014 ‘for services to children and child internet safety’. The audience delighted in this description, with energetic claps and cheers erupting around the room. In her talk, she asked the question, “What did I learn from spending over a year following around – at home and school, offline and online – a class of 13 year olds from an ordinary urban London school?” Sound interesting? Yes, it was. Professor Livingstone explained her ethnographic study in great detail, describing how this journey challenged popular inferences about teenagers’ use of the media, including the conception of the ‘digital native’. Further, she questioned the assumption that teenagers need to be constantly connected to the digital world, suggesting instead that being able to enter the digital world when they wish grants them agency. As a parent of three teenagers, this explanation has made me start to rethink my worries about their mobile phones becoming extensions of their hands. I am taking on board their possible need to protect their digital life and, in doing so, their individuality.

The accompanying book ‘The Class: Living and Learning in the Digital Age’ (Livingstone & Sefton-Green, 2016) is top of my Christmas list; I will ask my children to gift it to me!

Details of work by Professor Livingstone can be found on the LSE website:

http://www.lse.ac.uk/media%40lse/WhosWho/AcademicStaff/SoniaLivingstone.aspx

And so to the reason I was in the United States: the moment had arrived for this highly stressed but excited researcher to present this ‘well-travelled’ poster (approximately 5,318 miles from London to California). We were on stand number ten; number ten out of forty-five! That’s a lot of poster presentations for fellow academics to get through. I nervously stood before our poster, wondering whether, after they had sat through ten hours of talks earlier in the day, this would just be too much to ask of my fellow attendees. Maybe this would be one poster session too many?


(Please ignore the distortion of the picture. The poster board was bent!)

Nope! It seemed not. The room was teeming; to my relief, enthusiasm had not waned. Explaining the research and answering questions from many interested researchers made the session a great success. I had completed the mission and felt extremely proud of our research, a feeling akin to attending the Christmas nativity play when your child has a ‘talking’ role.

And so here I am sitting at my desk still suffering with jet lag but still feeling like a kid at Christmas and planning to put myself through the conference experience again as soon as possible.

Hagen, J. W. (2000). Society for Research in Child Development. In A. E. Kazdin (Ed.), Encyclopedia of Psychology, 7.

Hirsh-Pasek, K., Zosh, J. M., Golinkoff, R. M., Gray, J. H., Robb, M. B., & Kaufman, J. (2015). Putting education in “educational” apps: Lessons from the science of learning. Psychological Science in the Public Interest, 16(1), 3-34.

Livingstone, S., & Sefton-Green, J. (2016). The class: Living and learning in the digital age. NYU Press.

Radesky, J. S., Schumacher, J., & Zuckerman, B. (2015). Mobile and interactive media use by young children: the good, the bad, and the unknown. Pediatrics, 135(1), 1-3.


Nonsense, bullshit and constructive dialogues in Higher Education

Dr. Stavroula Tsirogianni, Social Psychology Lecturer with an interest in values, moral dilemmas and perspective taking, talks about her experiences of nonsense, bullshit and constructive dialogues within academia and Higher Education

In June, I went for the first time to the annual psychosocial studies conference to give a presentation on a paper I am writing with a colleague-friend. Psychosocial studies draws on a range of frameworks such as psychoanalysis, critical theory, postcolonial studies, and feminist and queer theories. As it is not a mainstream conference, it was quite small and people were very approachable, which made me really happy to be there. What most impressed me was the chairing style in some of the sessions. Some chairs did a brilliant job of creating an informal, open and conversational atmosphere, arranging chairs in circles or discouraging presenters from standing up, presenting with PowerPoint, or talking for more than 10 minutes. It worked. People in these sessions were more open, and conversations were more relaxed, more interesting and more constructive.

My presentation was on the last day of the conference. Normally I would be anxious, but this time I felt more relaxed because the setting felt safe. When my turn came and I gave my presentation, I asked the five people who attended my session for feedback and ideas on specific things I felt stuck on. A woman in the audience felt really offended by my ideas, because I mixed mainstream and critical psychological theories to talk about how we construct ourselves as ethical beings through common everyday actions and dilemmas, such as eating our dead pet dog, eating burgers from KFC or wearing leather shoes. She found my ideas to be nonsense. She expressed her contempt by rudely interrupting, asking me to look at her because she wanted to impart her ‘wisdom’ on me, while another woman in the audience was sharing her interesting experiences of dogs as a black person in South Africa during the Apartheid period. What happened in that session was that she wanted to establish her status as an ‘enlightened’ person who knows better. I did not take the incident personally, as such hierarchical interactions, as most of us who have been in academia for a long time know, are common. If this had happened to me 14 years ago, when I first got into academia, I would have been in tears.

…14 years ago…

When I came to the UK, I felt like my perception of myself and the world shattered to pieces. I came from Greece, where I grew up in a completely different educational system in which the teacher was the all-knowing figure. I was taught to look at myself and at the world in fixed binaries, i.e. right vs wrong, rationality vs imagination, individualism vs collectivism. I was mainly trained to look for the truth, reproduce knowledge and produce outcomes through standardised memory testing. Today, I remember very little about the things I learned in high school and university. My education was based on external authority and drills. There was no space for independent action, divergence of views or ambiguity.
When I first came to London and started my PhD at the LSE, I was completely taken aback by the diversity of people and perspectives. I felt ignorant, exposed and lost. I never talked in seminars or classes, and if I had to talk to someone senior, or someone I thought knew more than me, my heart pounded with anxiety. I had internalised this hierarchical way of thinking so much that I constantly felt too inadequate to believe in my own ideas.

Finding myself in such a multicultural and international environment challenged my biases, values and worldviews and brought up questions about authority and systems of power, their effects, how knowledge and identities come to be constructed and challenged. My experience of confusion and loss felt like a personal experience that had to be kept separate from the scholarly and educational process. It was through my exchanges and discussions with my peers and not with my teachers that I started addressing the connection of what I was learning and what I was experiencing.

Doing a PhD was a very confusing and lonesome process. But I was not the only one; a lot of my mates from my year felt the same way. It was this experience of isolation and loneliness that brought us together. We spent hours in discussion, bouncing ideas back and forth about our work, our plans, our anxieties, our aspirations, our lives, academia, about everything, over coffee, beer and cigarette breaks (it was then that I took up social smoking, which later became regular smoking). I clicked more with some than with others. Those I felt closest to were those I thought would not judge me for my ideas. Very often our conversations would get very heated and we would end up arguing, feeling frustrated and defensive, but they were still fun, exciting and informal, and above all they felt safe. Safe enough to play with ideas by talking ‘nonsense’. I came to realise that talking ‘nonsense’ is important in the process of elucidating thoughts and ideas.

Sadly, from my experience of academia over the past 14 years, this type of academic exchange is quite rare in the formal academic settings of seminars, meetings, symposia and conferences, and even in our classrooms. Scholarly dialogues are usually formal, lack excitement and tend to be competitive. Intellectual conversations take the form of wars between egos. There are always people in the audience who think of themselves as ‘enlightened’ and see their ideas as better than others’, even if their area of expertise is unrelated to what is being discussed. The aim of such exchanges is to discredit the speaker and to find holes in arguments. Of course, as academics we are passionate about what we do, and we do get attached to our ideas, which we try to protect and defend as our ‘babies’. Even the language we use to argue about our ideas, or to describe our experiences of conversations with colleagues, reflects the aggressive nature of academic dialogue. We often use phrases like this person ‘attacked me’, ‘attacked my views’ or ‘shot down my argument’. Even the PhD viva is called a ‘defence’.

While being critical is a very important part of advancing science and knowledge, the critical view is often associated with justifying theories, providing answers, finding holes in arguments and focusing excessively on details, on a small aspect of an argument. In my view, this type of criticism fails to take a discussion in new directions, open new perspectives or generate new questions. Criticism in this context becomes unsafe and threatening, since it prioritises cognitive closure, the quest for truth, a convergence of ideas and shutting down the dialogue, rather than divergence of views, complexity, collaboration and opening up the dialogue (it is worth reading Alfonso Montuori’s work on ambiguity and creativity).

Going back to the idea of nonsense: nonsense is a very important ingredient in critical thinking and imagination. Of course, there are different types of nonsense or bullshit (if you are interested in the topic, it is worth checking out Harry G. Frankfurt’s short philosophical essay ‘On Bullshit’). According to Frankfurt, there is nonsense whose main motive is pretentiousness and which aims to make an argument that suits one’s own purpose and agenda; and then there is nonsense that does not aim at a specific goal. The second kind is related to the concept of play. This type of nonsense allows us to play with ideas, make sense of them, tolerate and explore ambiguity, and imagine and generate different scenarios, questions and answers. From a developmental perspective, Vygotsky was among the first psychologists to talk about the importance of play in children’s emotional, social and cognitive development, and its contribution to our unique human ability for symbolic representation, such as imagination.

The reality is that for academics, as well as for students, our passions emerge, develop, evolve and are expressed within certain economic and institutional contexts, which make the concept of play sound ridiculous. I am not trying to promote a romantic view of academia and universities in which we need to spend hours staring out of our windows at the horizon or the stars, talking bullshit (although I quite enjoy doing that with friends when I get the chance). Nor am I claiming that there is not enough creativity in academia. However, if universities and academia are learning communities where old ideas, theories and assumptions about the world are questioned and new ideas emerge, there needs to be space for both convergence and divergence of ideas, and for idea selection.

A learning community is a place of possibilities. I believe that the theories and research we discuss represent the best possibilities for understanding the world only at present, and will be replaced by alternative possibilities in the future. It is important to learn how to give up old ideas about our courses, theories and research, and to unlearn ways of examining and looking at the world.

Just as societies are not a comforting ‘melting pot’ where different groups peacefully co-exist and agree to disagree with each other, the same goes for academia. Classrooms, seminars, conferences and meetings are not always safe and harmonious, and it would be a fantasy even to try to make them harmonious places. All knowledge is constructed against different histories of antagonism and misuse of power.

However, it is important to try to foster an environment in our classrooms, meetings and symposia that is collaborative and exploratory. Academic inquiry is not only an individual mental process; it is collaborative, and science is becoming increasingly collaborative. If we take into account the growing rates of anxiety and depression among students and academics in universities today [1], the need for community becomes even more important. Generative dialogue and collaboration, not ‘cutthroat’ competition, are also key skills that our students need to develop in order to become more able to resolve problems when they leave university. A recent meta-analysis of 168 studies across 51,000 employees in different industries found that leaders reward people who are interested in the success and welfare of the team and organisation rather than only themselves [2].

The classroom and academia are places where students and teachers learn how passionate engagement with ideas can lead to conflict, confusion and mistakes. Academia is also a place where, like any other place, we create, accumulate and use knowledge to defend certain positions. Many times we get stuck in these positions because they feel safe, protecting our egos and our privileges. However, especially in light of the current climate in higher education, we need to start fostering collaborative contexts in which the discovery, not the justification, of positions and ideas is rewarded. We need to cultivate a culture that promotes a view of human beings, the world and knowledge as evolving, not as fixed entities. A view of knowledge as an ongoing and evolving inquiry about ourselves and our discipline, one that at different stages generates answers but also more questions, could help us reconcile ourselves with the idea that it is ok to let go of our ideas, assumptions and positions.

1 – https://www.theguardian.com/education/2016/sep/23/university-mental-health-services-face-strain-as-demand-rises-50
2 – http://psycnet.apa.org/?&fa=main.doiLanding&doi=10.1037/a0013079


Comfort Dogs and the Spiritual Experience

Nicole Holt, research assistant to Dr. Liz Spruin, has been exploring the use of dogs for rehabilitation and wellbeing, and the benefits that ‘comfort dogs’ can provide

In America, a new role for dogs has been created: the “comfort dog”. These dogs and their handlers regularly go out into the community and interact with people at churches, schools, nursing homes, hospitals, and various social events. The label “comfort dog” was suggested by the Lutheran Church Charities organization in Northbrook, Illinois, which writes that ‘[comfort dogs] share the mercy and compassion of Jesus Christ with…people’.

While compassionate engagement with the community is commendable, some have argued that this work focuses on vulnerable individuals, particularly when the same individuals might be asked for financial donations. There is a wider issue about the safety and welfare of the dogs involved, but so far no substantiated reports of (mis)treatment have emerged.

However, though it is important to note criticisms of the scheme, we should also consider some of the benefits. For instance, these dogs are also used across the United States to help people in disaster-response situations. One also has to applaud the creative use of such animals, as dogs can be a calming influence on individuals in need of comfort. It might also be the case that the dogs gain comfort from the people they are helping, leading to a mutual benefit.

Moreover, there have been several case studies, alongside anecdotal evidence, demonstrating that engaging with dogs can help reduce stress and anxiety (Holder, 2013; O’Neill-Stephens, 2011; Wells, 2009). Also, many people, for one reason or another, are not in a situation that allows them to own dogs or even to interact with animals. So, just for a few hours, people can have the benefit of connecting with dogs, which contributes to health and well-being, helping people feel connected with a creature that makes no judgement. As long as a religious organisation is subtle rather than forceful in presenting its own beliefs, and keeps compassion at the centre of its interactions, it is easy to see the positives in this initiative.

As we push forward rapidly with the use of dogs in the courtroom, ‘comfort dogs’ provide food for thought about the many roles dogs could potentially play in society.

To find out more about comfort dogs please follow this link to their website: Immanuel Lutheran Ministries

References

Holder, C. (2013) ‘All Dogs Go To Court: the Impact of Court Facility Dogs as Comfort for Child Witnesses on a Defendant’s Right to a Fair Trial’ Hous. L. Rev., 50, p. 1155.

O’Neill-Stephens, E. (2011) ‘Courthouse Facility Dogs.’ [Online] Available at: http://www.courthousedogs.com/starting_news.html (Accessed 14th September 2016)

Wells, D. (2009) ‘The effects of animals on human health and well-being’, Journal of Social Issues, 65(3), pp. 523-543.


Artificial Intelligence in Teaching: The State of the Art

Our technician Richard Weatherall attended a talk on Artificial Intelligence in Education (UCL Knowledge Lab/Pearson) on 24th March

The term artificial intelligence (AI) evokes a wide range of associations, usually depending on an individual’s experience of, or exposure to, such systems. Many still hold the opinion that AI will herald the end of humanity, believing that its advent will make humans redundant or obsolete, or that it will advance to a point where it decides it has no need for humans and begins to act accordingly.

Fear-mongering aside, AI is already becoming embedded in our society, with companies such as Apple, Google and Microsoft all deploying AI personal assistants (Siri, Now and Cortana) as standard with their products. Historically, there has always been resistance to paradigm shifts such as this, and while some human jobs may indeed be replaced by a mechanical or synthetic counterpart, the opportunities created are often so vast and diverse that even the most prophetic cannot envision the direction of the future unravelling before them.

The International Artificial Intelligence in Education Society is concerned with researching and developing the use of artificial intelligence for learning, and believes that well-designed AI, working in collaboration with teachers, parents and learners, is paramount to maximising the vast benefits AI can offer. A recent talk hosted by the UCL Knowledge Lab, London, in collaboration with Pearson, highlighted current research in the area and included a demonstration of AI currently being deployed to assist learning.

The talk began with a review of meta-analyses examining the use of a number of AI systems and programs in classroom environments. Results varied slightly between studies, but overall gave a positive view of AI integration. Notably, intelligent tutoring systems performed as well as human teachers in one-to-one (non-expert) tuition. One common theme, however, was that despite the varied and positive ways AI systems have been implemented, there is no AI substitute for classroom experience: AI systems greatly aid the teacher, but are not able (or intended) to replace them.

The need for properly developed teaching aids is more important than ever given the government’s plan to academise schools, which may result in a shortage of regulated, experienced teachers and a focus on the financial bottom line. The bottom line for most teachers, however, is that they love to teach; technology must enhance, enable and empower them to this end, which is the goal of AI in education.

Whatever your opinion of AI, its widespread development and use only appears to be increasing. AI therapy programs, for example, are already being deployed to support mental health in remote areas and in populations isolated by war. As computational models of human emotion become more sophisticated, and machine learning ever better at detecting the mental state of its user, the need for properly theory-driven and data-driven designs becomes paramount. As one guest involved in computing and psychology at the AI.ED talk noted, computer programmers are not inherently psychologists, and psychologists are not trained as programmers – a knowledge gap that must be closed if AI is to be safely and productively deployed for the maximum benefit of humankind.

Intelligence Unleashed: An Argument for AI in Education is a publication produced by the UCL Knowledge Lab and Pearson to inform and educate about the current state of AI.

Some examples of current AI systems being researched and implemented in classrooms, and presented at AI.ED, include Betty’s Brain, italk2learn, Zondle, The Tardis Project and Whizz Education (http://whizz.com).


How social psychology will change your life

The typical CCCU Psychology student is drawn towards clinical, health, forensic, and/or educational psychology. But the sub-discipline of psychology that I love is social psychology, because of its relevance to everything that goes on in the world. Let me give you two examples:

In March 2015, the co-pilot of a German airliner crashed his plane in the French Alps, killing all passengers and crew. Media coverage of this tragic event quickly centred on his mental health and the security provisions on board. But, as if the event itself had not been shocking enough, the reader comments added to various online articles were soon full of speculation about a catastrophic safety failure on an aged and poorly maintained aircraft, now systematically covered up by the airline and the investigating authorities in France and Germany. An obvious but seldom asked question about this event is thus: Why were people so ready to believe these complex and sinister ideas, when the official investigation had suggested the (subsequently confirmed) suicide and murder early on?

There is a young but growing body of social-psychological literature about conspiracism, some of which features in our second-year module, Influences on Social Functioning. This literature suggests, for example, that people may believe conspiracy theories because big events prompt them to seek big explanations (Leman & Cinnirella, 2007). It also supplies some evidence of projection processes, whereby people will tend to believe in actions that they would be willing to take themselves (Douglas & Sutton, 2011). A possible – unproven – explanation for the belief in conspiracy theories around the 2015 airline crash is therefore that people could imagine complicity in a cover-up but not in the suicidal intentions of a single person causing the tragic deaths of so many.

The second example of how social psychology offers a different perspective on current events is very current indeed: The British EU referendum on 23 June 2016 will determine whether the UK leaves or remains a member of the European Union. Throughout the debate about a potential “Brexit”, I have been struck by speakers’ attempts to focus on economic arguments, when the issues so obviously involve identity, solidarity, power, and nationalism. As will be well known to students of our third-year module, The Psychology of Nations, there is evidence from discourse analytic studies that English interviewees may avoid talking about national pride for fear of appearing prejudiced (e.g. Condor, 2000), but commonly use references to “being an island” to highlight distinctiveness from other European nations (Abell et al., 2006). The way people feel about national and European identity does not seem to be well represented in the political arguments about the referendum, and important complexities in precisely these areas seem to be in danger of being overlooked prior to such an important decision. A quantitative study by Cinnirella and Hamilton (2007), for example, found significant negative correlations between British and European identity (r = -.25) and between British identity and attitudes towards Europe (r = -.46) among white British participants, but positive correlations between the same measures (r = .74 and r = .41, respectively) among British Asians. Perceived compatibility between Britishness and Europe is obviously variable and deserves to be part of the debate.

So here is my promise and challenge to you: Social psychology will help you think differently about current affairs. When following the news, try to apply this perspective and think about what it adds to your understanding. Feel free to send me your ideas – I’ll be interested in hearing about them!

References

Abell, J., Condor, S., & Stevenson, C. (2006). “We are an island”: Geographical imagery in accounts of citizenship, civil society, and national identity in Scotland and in England. Political Psychology, 27(2), 207-226. doi:10.1111/j.1467-9221.2006.00003.x

Cinnirella, M., & Hamilton, S. (2007). Are all Britons reluctant Europeans? Exploring European identity and attitudes to Europe among British citizens of South Asian ethnicity. Ethnic and Racial Studies, 30(3), 481–501. doi:10.1080/01419870701217530

Condor, S. (2000). Pride and prejudice: Identity management in English people’s talk about “this country”. Discourse and Society, 11(2), 175-205. doi: 10.1177/0957926500011002003

Douglas, K. M., & Sutton, R. M. (2011). Does it take one to know one? Endorsement of conspiracy theories is influenced by personal willingness to conspire. British Journal of Social Psychology, 50(3), 544-552. doi: 10.1111/j.2044-8309.2010.02018.x

Leman, P. J., & Cinnirella, M. (2007). A major event has a major cause: Evidence for the role of heuristics in reasoning about conspiracy theories. Social Psychological Review, 9(2), 18-28.


Read all about it!

As we come to the end of the 2015-16 academic year, it’s time to update you on our recent news and events. Our latest CCCU Psychology Newsletter introduces our new Director of Psychology – Dr. Amanda Carr. We also cover: highlights from “Psychology has talent”, student stories, research project updates and more. We hope you enjoy keeping up to date with CCCU Psychology. Don’t forget you can also connect with us via Facebook (CCCU Psychology), Twitter (@CCCUPsych) and LinkedIn (CCCU Psychology). Finally, good luck to our current students for the last round of deadlines and exams!

Download the newsletter now!


The Horse as a Therapist’s Assistant

Philosophers and practitioners alike have recognised that humankind has had a long and enduring relationship with all things natural. Jung suggested that our over-civilised selves could do with some re-wilding! One route to this is to re-connect with our animal brethren, and for those who have suffered psychological trauma, a more specific approach is through animal-assisted therapy. While many animals provide comfort at a simple psychological and physiological level as ‘companions’, others elicit different responses and experiences. Compare, for instance, being in the company of a whale or a goldfish. In the company of a large animal, it is possible that a sense of awe triggers regression to developmental periods that reflect similar relational experiences during childhood, which enables a process of resolution and self-development. Moreover, for people with severe trauma, or those for whom talking is difficult, such as young children and people with autism, non-linguistic and embodied communication may be more important. Animals are able to ‘read’ these signals and to respond and interact in a way that is perceived by many as safe and therapeutic.


In particular, the use of horses to aid recovery from trauma, typically Equine Assisted Therapy, has been gaining interest in recent years. A forthcoming chapter in a book addressing various forms of outdoor therapy, Ecotherapy: Theory, Research and Practice, edited by Martin Jordan and Joe Hinds, details the ideas, benefits and practice behind Equine Assisted Therapy.


The horse encourages people to act and behave in a way consistent with their actual thoughts and feelings (congruence), which may potentially be overlooked or under-developed within the counselling room. The horse responds with honesty and without a hidden agenda reacting to the internal world of the person, regardless of their efforts to conceal it.

Direct experiences with a large animal can produce a sense of perceived mutual understanding – verbal, cognitive communication is substituted for a basic or primal communication that is symbolic and directly dictates behaviour. The human response to a large animal will often be attuned to its physical presence and its movements and may, through the tactile, embodied and physical quality of the animal encounter, surface important unexpressed emotions and build relational aspects of the self that have been thwarted or under-developed. The actor Martin Clunes experiences this directly in the video below:

In short, these experiences can prompt authentic moments, whereby the statement ‘actions speak louder than words’ has added significance.


Fancy an Insect?

Last week’s evolutionary psychology seminar was on the topic of emotions, and why we have them. There are six basic human emotions (and some of them are shown in many animals as well). They are happiness, sadness, anger, surprise, fear, and disgust. Across the globe, people of all cultures and backgrounds produce the same facial expressions for each of these, and they are universally understood. In our seminar, there were very few happy faces (it was Friday afternoon in January…), but not too many sad, angry or scared people. One of the papers we discussed was on the reasons disgust would be useful for our survival. To demonstrate the feeling and to evoke the facial expressions, students were offered a dish with dried mealworms and crickets (produced for human consumption…).

I am making a disgusted face as I am typing this. But why would a nutritious, free snack be so off-putting? Insects are high in protein and minerals and, therefore, a valuable treat. In many countries, insects are a staple food (the eating of insects is called entomophagy). Interestingly, these insects are all vegetarian. Insects that feed on contaminated matter, such as excrement, dead meat or blood, are not used as human foods.

For a person with a European upbringing, the eating of insects is unusual (I am not talking about the flies swallowed by cyclists, or aphids eaten with a salad). But why, when we can easily overcome unfamiliarity with new foods from different parts of the world (sushi, anyone?), do we have so much trouble with eating insects?

We all show disgust. It helps our survival by making us avoid things that might make us sick. We are disgusted by bodily excretions and parts, by slimy, damp and stinky items, and by rotten food. And if you dare to think about what might indicate that food has gone off, or what wriggles in excrement, you will come up with worms. They are also slimy and damp. They trigger our ‘do not eat!’ alarm. And so we show disgust on our faces at the mere thought of the kind offer of a dried mealworm with our tea. Humans from insect-eating cultures will be equally disgusted by our European habit of eating meat stuffed into the colon of animals (sausages), or sheep intestines encased in the animal’s stomach (haggis). But, overall, the emotion of disgust serves an evolutionary purpose: to enhance our survival. Basically, hygiene is in our genes.

(Mealworms don’t taste of much, but leave a lingering, rather unpleasant aftertaste. And no, I didn’t try the crickets.)


Oscar unable to convince judges “it’s raining”

In a country where 50 people are murdered every day, where only half of all murder cases are sent to court, and where a shocking 12% of those result in a guilty verdict (South Africa Statistics Association, 2015), South Africa is considered ‘the most murderous society on earth’ (Nedcore Project, 2015). The South Africa Statistics Association (SASA) recently revealed that, next to war-torn countries, the high murder rate makes this country the worst conflict-affected area in the world (McCafferty, 2015). The statistics speak for themselves: it is clear that the South African criminal justice system is failing. Arguably, that was until yesterday… perhaps things are starting to change in a society deemed to have the highest rates of violence against women in the world (Faul, 2013).

For over two years, millions of people across the world have watched the South African justice system unfold in the Oscar Pistorius trial. Prosecution and defence have traded blows over what happened the night Oscar Pistorius killed his girlfriend, Reeva Steenkamp. Why Pistorius fired four deadly shots through a locked bathroom door on 14 February 2013 is likely to remain a mystery. Whether Pistorius knew it was Steenkamp behind the bathroom door, or believed it was an intruder, will perhaps be debated for years. But the shock that exploded across the world when Pistorius was convicted only of culpable homicide, spending just one year in prison for taking an innocent life, was uncontested. Indeed, in one survey only 7% of the British public believed him to be innocent (Dalgreen, 2015).

In my opinion, the courts in South Africa made an unprecedented leap yesterday, when Pistorius’ original verdict was overturned and replaced with one of murder. This conviction (which carries a minimum sentence of 15 years in South Africa) came unanimously from five judges, who stated that Pistorius ‘never offered an acceptable explanation’ for firing four times through a closed door. Although we may never know exactly what happened the night Steenkamp was killed, one thing is certain: Pistorius’ account stretches the imagination. Perhaps Judge Judy, confronted with defendants’ elaborate stories, summed it up best: ‘don’t pee on my leg and tell me it’s raining’.
