In Defence of the Scientific Method

I recently listened to a podcast interview with Dr Adam Gazzaley, a neuroscientist and Director of the Gazzaley Lab at UC San Francisco. While Dr Gazzaley's work is both interesting and practical, the real takeaway for me from the podcast was a reconfirmed commitment to the scientific method. This is not to be mistaken for a belief in science, with which I have become more and more disillusioned in recent years. Rather, it is to avoid throwing the baby out with the bathwater and to make clear the distinction between the flawed practice of science and the body of techniques that comprise the scientific method.

The scientific method dates back to the 17th century and involves systematic observation, measurement and experimentation, together with the formulation, testing and modification of hypotheses (cf. https://en.wikipedia.org/wiki/Scientific_method). Without going into the history of its development, the application of these principles has since been the basis of societal development. The refinement of this thinking by the likes of Karl Popper, together with a multi-disciplinary approach and the appropriate use of logic and mathematics, is central to our search for truth (using the term loosely).

The problem is that the discipline of science continually lets itself down. It allows itself to be compromised, tarnishing the scientific method in the process. This issue has been discussed extensively in this blog. A recent article in the Times Higher Education added to the growing body of work questioning the usefulness of published scientific research.

The failure of science to live up to its own ideal allows detractors to question the scientific approach and to put forward opinion, faith and n=1 case studies as the new standard of knowledge. This is a terrible shame, as it detracts from one of the true breakthroughs in the human story.

The message of this blog is simple. The practice of science in modern society deserves to be questioned, and the failure is systemic, from universities (refer to previous post) through to publications (refer to previous post). This failure is, in the main, a by-product of the commercialisation of all things scientific. Yet rather than being the fault of the scientific method, it is the scientific method that provides the lens through which these failures can be seen most clearly.

Having been involved in building a business in I/O psychology, I understand the role and requirements of marketing. Understanding this is a key part of partner and staff induction. If I/O psychology is to progress, however, our discipline must still adhere as closely as possible to the scientist-practitioner model (refer to previous post), guided by adherence to the scientific method.

Humanity owes a lot to the scientific method. We must never allow this to be forgotten during the critique of the current practice of science. The method must be given the respect it deserves. To paraphrase Dr Gazzaley, simply being 'based in science' is not our benchmark. Adherence to the scientific method with rigour and objectivity, noting all limitations, is the litmus test by which we need to measure ourselves as I/O practitioners.


Is Competition good for Science?

I have long been a strong supporter of capitalism. I believe in free trade, unbridled competition, and the consumer's right to make choices in their own self-interest. I have seen laissez-faire capitalism, and the competition that it breeds, as key to well-functioning economies, and competition as essential to good long-term solutions, without exception.

As noted, I have held this view for a long time, and without exception, but recently I have been deeply challenged as to whether the model applies to all pursuits. In particular, I am questioning whether competition is truly good for science. This is not a statement I make lightly; it comes after much reflection on the discipline and on the nature of the industry I work in, both as a lecturer and as a practitioner of I/O psychology.

There is a growing uprising against what many perceive as the management takeover of universities. The open-access article 'The Academic Manifesto' speaks to this view, and its opening paragraph captures the essence of the argument:

“… The Wolf has colonised academia with a mercenary army of professional administrators, armed with spreadsheets, output indicators and audit procedures, loudly accompanied by the Efficiency and Excellence March. Management has proclaimed academics the enemy within: academics cannot be trusted, and so have to be tested and monitored, under the permanent threat of reorganisation, termination and dismissal…”

While I can certainly see that efficiencies can be made in universities, and that the need for accountability is high, I can't help but agree with the writers that the current KPIs don't make the grade (no pun intended). The 'publish or perish' phenomenon works against producing quality research that is developed over the long term.

Competition also crowds out research that is valuable but not newsworthy. This topic has been discussed previously in this blog (the-problem-with-academia-as-a-medium-of-change-or-critique), but replication, which is at the heart of our science, remains sorely lacking (Earp, B. D., & Trafimow, D. (2015). Replication, falsification, and the crisis of confidence in social psychology. Frontiers in Psychology, 6, 621).

We have even coined terms such as HARKing (Hypothesising After the Results are Known) to describe how we have moved away from a priori hypothesis testing, which is central to science, towards defining hypotheses only after the results are in (Bosco, F. A., Aguinis, H., Field, J. G., Pierce, C. A., & Dalton, D. R. (in press). HARKing's threat to organizational research: Evidence from primary and meta-analytic sources. Personnel Psychology).

Likewise, the continued growth in universities, and the competition between them, without a matching growth in jobs is being questioned in many countries. When a degree simply becomes a means to an end, does it produce the well-rounded, educated population required for a fully functioning, progressive society?

At a practitioner level, the folly of competition is perhaps most apparent in psychometric testing, an industry I'm acutely familiar with. Test publishers go to great lengths to differentiate themselves so as to carve out a niche in the competitive landscape (are-tests-really-that-different). This is despite the fact that construct validity, the centrepiece of modern validity theory, in essence requires cross-validation. The result is a myriad of test providers spouting 'mine is bigger than yours' rhetoric, to the detriment of science. Too often users are more concerned about the colours used in reports than about the science and validity of the test.

Contrast this with a non-competitive approach to science. The examples are numerous, but given the interest in psychology, take the Human Brain Project as an example. Here we have 112 partners in 24 countries collaborating around a common goal towards a target date of 2023, and the driver is not competition but the objective itself: truly expanding our knowledge of the human brain.

The US has its equivalent, the BRAIN Initiative, and there is further collaboration to combine the efforts of these two undertakings. With the advances in physics that have given rise to brain-scanning technology, we now understand more than ever about the processes of the mind. This simply would not be possible under a competitive model of science.

My experience as a practitioner selling assessment and consulting solutions, as a lecturer who has taught across multiple universities, and as a general science buff has led me to see the downside of competition for science. Competition still has a place in my heart, but, perhaps like chardonnay and steak, the value of competition and science may not always be realised when the two are combined.


Learning agility: where wisdom meets courageous problem solving

The Iliad is among the earliest pieces of Western literature and illustrates the generally distinct characteristics of wisdom on the one hand and courageous, risk-laden problem solving on the other. King Nestor the wise might miss opportunities for gain due to his caution, but is renowned for eventually making great decisions based on his judgement, knowledge, and experience. Odysseus, by contrast, has a great ability to courageously solve problems in circumstances of extreme risk, but more often than not gets himself into such situations through his own lack of wisdom!

The title of this blog suggests that learning agility bridges the gap between Nestor's wisdom and Odysseus's courageous problem solving. So what exactly do we mean by 'learning agility'? While learning in general can be broadly defined by one's ability and willingness to learn, learning agility concerns the speed with which people learn and the flexibility with which they apply that learning. A hallmark of the agile learner is the ability to learn from previous experience and apply that learning in current situations, often in creative or unique ways. Sounds wise, right?

Yet agile learners do more than learn from their previous experiences. They also have Odysseus's panache when it comes to putting themselves into challenging and uncertain situations. Like our Greek hero, they are sufficiently present and open to challenge the status quo, remain calm in the face of adversity, and test out alternatives. In doing so, they are able to courageously seize opportunities and turn adversity to their advantage.

So what leads to differences in learning agility? One of the key ingredients is mindset, which either facilitates or inhibits a leader in demonstrating the behaviours associated with continual growth, development, and the use of new strategies that will equip them for the increasingly complex problems they face.

If people have a fixed mindset, they believe their basic qualities, like their intelligence or talent, are simply fixed traits. They spend their time documenting their intelligence or talent instead of developing them. For this reason they are less likely to reflect on and learn from failures, as they see failures as an indication that the limits of their ability have been reached. They are also less likely to be comfortable putting themselves into challenging or uncertain situations, or trying out new strategies that carry a risk of failure.

If, on the other hand, they have a growth mindset, they believe that their most basic abilities can be developed through dedication and hard work; brains and talent are just the starting point. This view creates a love of learning and a resilience that is essential for the growth and development associated with great accomplishment and learning agility. They are comfortable risking failure because getting it wrong just presents another opportunity to learn and grow. It is in such fertile soil that the seeds of problem solving grow.

Furthermore, decades of research into neuroplasticity have taught us that those with the growth mindset are right. Your talents and abilities aren't fixed and can be developed through experience and opportunity. Yet even the most growth-minded amongst us can have difficulty remaining calm in the face of adversity and making risky decisions when the stakes are high. This is where our courage quotient comes to the fore, in the shape of courageous problem solving.

When we think of courage we often think of the big examples: Victoria Cross winners, or those who otherwise risk life and limb in the process of saving another or doing what's right. Yet personal courage is something we all have the opportunity to demonstrate on a day-to-day basis, and to build and develop through practice and application. Personal courage is about acting when we experience anxiety due to uncertainty or risk, the defining features of the contexts in which learning agility is displayed. Fortunately most of our decisions do not run the risks Odysseus faced, such as being turned to stone, lured into crashing our ship onto rocks, eaten by monsters, or sucked into whirlpools. Yet the brain often treats the risks we do encounter (e.g., risks to status, autonomy, and the approval of others) in much the same way.

So all we need to do to increase our learning agility is combine the best characteristics of two Greek heroes! We need to pair Odysseus's courageous problem solving with Nestor's wise, experience-based judgement. Fortunately, learning agility isn't something you either have or don't have; it is something we can all develop and grow in rather less epic circumstances than the Iliad. Some of the things we can all do in our daily lives to increase our learning agility are:

  • Seek challenging feedback
  • Take action when we experience anxiety and there is some element of uncertainty or risk (i.e., exercise personal courage)
  • Reflect on what worked well and didn’t in different situations, and think about what could have been done differently
  • Ask questions to understand without the need to be understood
  • Try to identify and challenge some of the basic assumptions underlying our usual way of seeing things

Fortunately for us mere mortals, OPRA facilitates a learning agility workshop wherein participants build their capacity to demonstrate these actions. Participants also gain insight and tools concerning the mindsets, self-talk, motivations, and mental framing associated with learning agility.

If you would like to discuss how OPRA can support your learning and development with a proven, research-based workshop for enhancing your ability to wisely demonstrate courageous problem solving (i.e., learning agility!), then please contact your local OPRA office:

Wellington: 04 499 2884 or Wellington@opragroup.com

Auckland: 09 358 3233 or Auckland@opragroup.com

Christchurch: 03 379 7377 or Christchurch@opragroup.com

Australia: +61 2 4044 0450 or support@beilbyopragroup.co.au

Singapore: +65 3152 5720 or Singapore@opragroup.com


The Adaptive Skills and Behaviours Required to Succeed in Future Work Environments

There is a lot being said about the future of work and what it means for the skills, attitudes, and behaviours we will need to succeed. With this future already upon us, it is important that we pick up our pace of change and look to build capability that helps us adapt, thrive and succeed in an ever-changing world. Best-selling author Jacob Morgan describes in his latest book 'The Future of Work' five trends shaping the future of work:

  1. New behaviours
  2. Technology
  3. Millennials
  4. Mobility
  5. Globalisation

These trends are bringing a dramatic shift in attitudes and ways of working: new behaviours, approaches, and workplace expectations. Whilst many of us are sensing these rapid changes, we aren't necessarily sure why they are happening, what they mean, or how they will impact us.

As Jacob Morgan says:

“The disruption of every industry is also causing a bit of unrest as people struggle to define where they fit or if they will be obsolete.  It’s forcing us to adapt and change to stay relevant while giving rise to new business models, new products, new companies, new behaviours, and new ways of simply existing in today’s world”.

So, the burning questions are:  what exactly do these changes look like for employees, managers, and organisations?  And, what skills, attitudes, and behaviours do we require to succeed?

What we do know is that modern employees are more self-directed, collaborative in their approach, and want to shape and define their own career paths instead of having them predefined for them.  They are continually seeking out learning opportunities that fit with their personal purpose and professional aspirations, and are looking for development opportunities that benefit them holistically as a ‘whole person’.  They seek the skills, confidence and healthy mind-set to challenge the status quo, to think on their feet, and to continually adapt within highly fluid and ever changing organisational environments.  They are looking to learn and develop emotional and social intelligence;  to work within increasingly networked communities;  to lead, collaborate, innovate and share.

Consistent with the above are five crucial behaviours, identified by Morgan, that employees require in the modern workplace:

  1. Self-Direction and Autonomy – to continually learn, and stay on top of important tasks within manager-less organisations
  2. Filter and Focus – to be able to manage the cognitive load associated with increasing amounts of pervasive information
  3. Embracing Change – to continually adapt to new working practices whilst demonstrating resilience and healthy mind-sets
  4. Comprehensive Communication Skills – to support collaborative work practices, and to communicate ideas and provide feedback succinctly
  5. Learning to Learn – to be willing to adopt a pro-learning mind-set; to step outside comfort zones, reflect, and make meaning of experiences.

Organisations also need to adapt to the future of work to support these trends and demands, and to ensure they are attracting, developing, and retaining top talent. A good place to start is by fostering and embracing the principles of organisational learning. Peter Senge suggested in his book 'The Fifth Discipline: The Art and Practice of the Learning Organization' that in order to remain competitive within the complex and volatile business environments in which we find ourselves operating, an organisation must build its capacity for continual transformation. This involves developing cultures that:

  • Encourage and support employees in their pursuit of personal mastery (the discipline of continually clarifying and deepening our personal vision, and seeing reality objectively)
  • Encourage employees to challenge ingrained assumptions and mental models
  • Foster genuine commitment and enrolment through shared visions.

Here at OPRA we are developing a carefully selected set of best-of-breed soft-skill learning and development programmes to help individuals and organisations embrace these current and future trends. Our programmes are designed to equip professionals with the emotional intelligence, healthy thinking, learning agility, collaborative team behaviours, and motivation required to demonstrate exceptional performance in the modern workplace. We have grounded our programmes in the principles of positive psychology and in the understanding that REAL learning and engagement only occur when self-awareness, participation, and a tangible sense of progress are present. In light of this, all our programmes are designed to:

  • Develop self-insight and raise awareness of individual and collective strengths
  • Utilise proven, research-based content, delivered by expert and accredited practitioners
  • Provide access to on-going professional coaching opportunities to further deepen learning
  • Incorporate social learning methodologies to encourage and enable collaboration and sharing
  • Provide applied on-the-job challenges and reflection to embed and sustain behavioural changes.

Watch this space for further announcements about OPRA Develop over the coming months. In the meantime, if you would like to discuss how OPRA can support your learning and development with proven, research-based soft-skill development programmes, then please contact your local OPRA office:

Wellington: 04 499 2884 or Wellington@opragroup.com

Auckland: 09 358 3233 or Auckland@opragroup.com

Christchurch: 03 379 7377 or Christchurch@opragroup.com

Australia: +61 2 4044 0450 or support@beilbyopragroup.co.au

Singapore: +65 3152 5720 or Singapore@opragroup.com


Tips to spot a myth

Well, there it is: another year down and another year to look forward to. This brings to an end this series on some of the myths of our industry, and I wanted to finish by summarising some guidelines on how to become more critical about i/o research and the conclusions drawn from our discipline.

Our discipline is not all mythology, as shown in some of my recent posts on the effectiveness of training and the value of personality testing. On the contrary, there is a growing body of findings that show what works, what doesn't and why. However, claims move from fact to fiction when commercialisation and academic reputation take over.

With this in mind, those attempting to apply research need a simple way to test the soundness of what they are reading. Here are my top seven tips for spotting myths:

  1. Who has done the research? There are many vested interests in psychology, from commercial firms touting the next big thing through to academics defending a position they have built for themselves. When you understand a person's starting position, you will read what they write with open eyes. When evaluating any claim, ask yourself: 'What is their angle, and do they have anything to gain from such a claim? Are they presenting a balanced argument, or reporting commercial findings in a fair manner?'
  2. Are the claims too good to be true? Dealing with human behaviour is a messy business. Single variables, on a good day with the wind blowing in the right direction, account for roughly 10% of the variability in a given outcome (e.g. a correlation of r = 0.3 between a personality trait and job performance). Unfortunately, the public are unaware of this and have expectations around prediction that are simply unrealistic. These expectations are then played on by marketing companies that make claims such as '90% accuracy'. Such claims are outrageous and a sure sign that you are once again in the clutches of a myth. (The sketch following this list works through the arithmetic.)
  3. When looking at applied studies, does the research design account for moderator variables? Psychological research often fails to be useful because it fails to account for moderator variables. Too often we get simple correlations between variables without recognising that the entire finding is eroded unless certain conditions are met, or once another variable enters the scene.
  4. Is the research discussed as part of a system? Building on the previous point, research that does not discuss its findings as part of a wider eco-system is invariably limited. As scientist-practitioners, our work does not exist in a vacuum. It is part of a complex set of ever-changing, intertwining variables that together produce an outcome: selection leads to on-boarding, which leads to training, which leads to performance management, and so on. Research needs to identify this system and report findings accordingly.
  5. Are the results supported by logic as well as numbers? Nothing can blind the reader of i/o science like numbers. As the sophistication of mathematical justification in our discipline has grown, the usefulness of many of the studies has dropped. Psychology is as much a philosophy as a science, and logic is as important as numbers in demonstrating an evidence base. Look for studies that follow the laws of logic, where hypotheses are not only supported but alternative explanations are ruled out. Look for studies that are parsimonious in their explanation, but not so simplistic that they fail to account for the underlying complexity of human behaviour.
  6. Are the results practically meaningful? Don't be confused by statistical significance. This simply means we have a certain level of confidence that a finding was not due to chance and that, if the study were repeated, we would be likely to get a similar result. It tells us nothing about the practical significance of the finding (i.e. How useful is it? How do I use it?). Too often I see tiny but statistically significant findings touted as a 'breakthrough'. The reality is that the finding is so small as to be meaningless unless applied to huge samples (see the sketch after this list).
  7. Be critical first, acquiesce second! If I have one piece of advice, it is to be critical first and accept nothing until convinced. Don't accept anything because of the speaker, the company, or the numbers. Instead, make anyone and everyone convince you. How is this done? Ask why. Ask what. Ask how. If you do nothing besides taking this stance as part of a critical review, it will make you a far more effective user of research and a far better i/o psychologist or HR professional.
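
To make points 2 and 6 concrete, here is a minimal sketch (assuming Python with NumPy and SciPy; the figures are purely illustrative and not drawn from any particular study) of why r = 0.3 translates to roughly 10% of variance explained, and why a trivially small correlation can still come out 'statistically significant' in a large sample:

```python
import numpy as np
from scipy import stats

# Point 2: variance explained by a single predictor is r squared.
r = 0.30
print(f"r = {r:.2f} -> variance explained = {r**2:.0%}")  # ~9%, i.e. roughly 10%

# Point 6: statistical vs practical significance.
# A tiny correlation (r = 0.05, ~0.25% of variance) is non-significant in a
# modest sample yet highly 'significant' in a huge one, despite being
# practically meaningless in both cases.
def correlation_p_value(r, n):
    """Two-tailed p-value for a Pearson correlation r observed in a sample of size n."""
    t = r * np.sqrt((n - 2) / (1 - r**2))
    return 2 * stats.t.sf(abs(t), df=n - 2)

for n in (100, 100_000):
    print(f"r = 0.05, n = {n:>7}: p = {correlation_p_value(0.05, n):.4f}")
```

The arithmetic is the point: the p-value collapses purely because the sample grows, while the effect size, and hence its practical value, stays exactly the same.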

To all those who have read and enjoyed this blog over the year, we at OPRA thank you. As a company we are passionate about i/o, warts and all, and it is a great privilege to contribute to the dialogue that challenges our discipline to be all that it can be. Have a great 2015 and we look forward to catching up with you offline and online over the year.


The Myth that Training is an Art not a Science

For many, training is seen as an art, and a black art at that, rather than a science. The idea that there is actually a science to training, and a methodology to be followed to ensure its effectiveness, is anathema to those who view their own training as some special gift that they alone possess. Much like claims in the psychometric industry that a single test is the holy grail of testing, these outrageous training claims are myths that simply distract from the truth. On the contrary, training is now a well-researched area, and there is indeed a science to making it work.

Building on their seminal work on training for team effectiveness, Salas and his team have produced an excellent paper outlining the science of training (Salas, E., Tannenbaum, S. I., Kraiger, K., & Smith-Jentsch, K. A. (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13(2), 74-101).

The paper is a free download and is one of those must-haves for all practitioners. First, it covers the various meta-analyses that have been conducted on training and notes that training has been found to be effective for everything from managerial training and leadership development through to behavioural modelling training.

Moreover, the paper provides clear guidelines on how to enhance training effectiveness. Building on the research, the guidelines for practitioners include:

  1. Pre-training recommendations
    1. Training needs analysis
      • Analysis of the job
      • Analysis of the organisation
      • Analysis of the person
    2. Communication strategy
      • Notify attendees
      • Notify supervisors
  2. During-training interventions
    1. Creating the learner mind-set
    2. Following appropriate instructional principles
    3. Using technology wisely
  3. Post-training
    1. Ensure training transfer
    2. Evaluation methodology

The paper is in many ways what our discipline is all about: a strong research base, drawing together findings from multiple sources, with useful guidance provided for the practitioner. This is applied psychology, and this is the scientist-practitioner model in practice.

As noted by Paul Thayer in his editorial to the paper:

“… There is a system and a science to guide organizations of all types in developing and/or adopting training to help achieve organizational goals. Salas et al. do an excellent job of summarizing what is known and providing concrete steps to ensure that valuable dollars will be spent on training that will improve performance and aid in the achievement of those goals. In addition, they provide a rich bibliography that will assist anyone needing more information as to how to implement any or all the steps to provide effective training. Further, they raise important questions that organizational leaders and policymakers should ask before investing in any training program or technology”.

There are many myths that pervade business psychology. Unfortunately, these often result in the baby being thrown out with the bathwater and people dismissing the discipline as a whole. The key for any discerning HR professional or i/o psychologist is to be able to tell myth from reality and to have a simple framework, or set of checkpoints, for being a discerning reader of research. More on this tomorrow in the last blog for the year.


The myth that training to improve team functioning doesn’t work

Yesterday we noted that there was little support for the Belbin team model. The idea that there is a prescribed model for a team is simply not supported and the Belbin model does not improve organisational effectiveness. Taking this into consideration, does training to improve team functionality actually make a difference?

I'm pleased to note that training to improve team performance is an area that is well researched, and the research is generally positive. Not only do interventions appear to improve team effectiveness, we also have an idea, through research, of what moderates the success of team interventions.

In terms of the research on team training, the seminal work in the area is a meta-analysis conducted in 2008. For those not from a research background, a meta-analysis can be thought of as an analysis of analyses: the researchers bring together various studies and re-analyse the data to gain greater confidence in the results through a larger combined sample size. While the technique has its critics and may lead to statistical overestimates, it is one of the better methods we have for establishing an evidence base for generalisable trends in applied research.
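
For readers who want that intuition in concrete form, here is a minimal sketch of the core calculation behind a simple fixed-effect meta-analysis: each study's effect size is weighted by the inverse of its variance, so larger, more precise studies count for more in the pooled estimate. The effect sizes and sample sizes below are invented for illustration and are not taken from the team-training literature.

```python
import numpy as np

# Hypothetical standardised mean differences (Cohen's d) from four studies,
# each comparing a trained group with a control group of equal size n.
effects = np.array([0.45, 0.30, 0.62, 0.20])
n_per_group = np.array([20, 55, 15, 80])

# Approximate sampling variance of d for two groups of size n: 2/n + d^2/(4n)
variances = 2 / n_per_group + effects**2 / (4 * n_per_group)

# Fixed-effect pooling: inverse-variance weights favour the more precise studies
weights = 1 / variances
pooled = np.sum(weights * effects) / np.sum(weights)
se_pooled = np.sqrt(1 / np.sum(weights))

print(f"Pooled effect size d = {pooled:.2f} (SE = {se_pooled:.2f})")
```

Real meta-analyses add refinements such as random-effects models and corrections for publication bias, but this weighted-average logic is the heart of 'an analysis of analyses'.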

The team training effectiveness meta-analysis was extremely thorough in examining both outcomes and moderators. A range of outcomes were assessed, including:

  1. Cognitive outcomes predominantly consisted of declarative knowledge gains.
  2. Team member affective outcomes included socialisation, trust and confidence in team members’ ability and attitudes concerning the perceived effectiveness of team communication and coordination processes.
  3. Team processes included behavioural measures of communication, coordination, strategy development, self-correction, assertiveness, decision making and situation assessment.
  4. Team performance integrated quantity, quality, accuracy, efficiency and effectiveness outcomes.

Moderator variables included:

  1. Training content (taskwork, teamwork, mixed)
  2. Team stability (intact, ad hoc)
  3. Team size (large, medium, small)

While a blog post is not sufficient to explore the research in depth, suffice it to say that moderate to strong positive results were found for all four outcome types. Team processes appear to be the most malleable: training teams to communicate better, avoid groupthink, make effective decisions and think strategically is likely to be an investment that delivers returns for organisations. Training to improve affective outcomes, such as trust and confidence in team members, appears less effective. This was especially the case for large teams.

Aside from team size, the results were moderated by team stability, with well-established teams responding better to training than ad hoc teams. Training content had a limited effect on outcomes, with both taskwork- and teamwork-oriented interventions producing positive results.

The results of this meta-analysis are encouraging for i/o psychology. Team effectiveness is an area where there is a strong research basis for intervention and where intervention is likely to have a positive impact. This is an area where the scientist-practitioner model that is central to our discipline appears to be alive and well. We have interventions that are well researched, and we have some understanding of how effective they are once other variables are taken into account. Does this amount to a science of training? Are there principles we can take from the literature that can be applied to make training effective? Or is training an art and not a science? That is the question for tomorrow.
