- Working Evaluatively: How much do we learn from our own experiences? - February 19, 2016
The development of evaluation research capacity faces some inherent challenges. Among these, the knowledge and skills that are shared should be appropriate for the context, and methodologies should be carefully designed to ensure that existing power relations are not reinforced. Research agendas, like development agendas, are subject to powerful force fields that include elements of personal choice and the existing social capital within a society.
A few months ago I made a presentation at a SAMEA Evaluation Cafe on “Thinking Evaluatively about Social Issues”. I made it clear that ‘evaluative thinking’ was not a substitute for evaluation, but that thinking evaluatively about our own practice was a central part of doing an evaluation. It involves identifying your own assumptions, and posing thoughtful questions BEFORE embarking on the methodological actions involved in evaluation.
The presentation was well received, but afterwards I thought it would have been useful to share some of my own research development experiences to make my point. So I share them here.
The novice fieldworker
During my final year of undergraduate studies at university, I signed up to do paid fieldwork. I had to administer a survey for a Psychology professor, and it was my first experience of ‘research’. The survey was aimed at assessing the respondents’ perceptions of crime in their area in order to compare the perceptions of those who had been victims of crime with those who had observed criminal activities, or were aware of them in the areas where they lived. We were exposed to examples ranging from simple random sampling to non-probability sampling techniques. We were given the rationale for the chosen sampling technique and received training in how to ask the questions.
My geographic area was a section of a township with which I was familiar. I had to visit 48 appropriately sampled houses between 6:00pm and 7:30pm each day to ensure that the head of the household was present.
My first two interviews took around 25 minutes each to complete, as planned. The high fence of the third house had a gate with a thick chain tied with a huge padlock. A big dog greeted me with ferocious barks through the slatted gate. A tinny voice, shouting through the burglar barred window, asked me what I wanted. I identified myself as a university student doing a survey. After many clicks of opening door latches and locks, an old lady emerged to calm the dog and open the gate. She invited me inside and offered me tea.
After a while she asked me to explain the purpose of the research. I did this to the best of my ability. She responded with a simple question. “After you have collected all the information – what are you going to do about it?”
Thinking that she was referring to the physical data, I said that my professor would analyse the data and make sense of it. She said that this was fine. “But what are you going to do about the crime?” she asked. I did not have an answer. She responded to all the survey questions. I thanked her and indicated that it was time for me to move on. We said our goodbyes through the slats in the padlocked gate. Her final words followed me: “Let me know what you are going to do about the crime.”
It was too late for me to continue that night. The next day I went to the professor and confronted him with the old lady’s question. He responded that it was not his responsibility to do the work of the police; he was merely documenting perceptions of crime for an article that would become a chapter in a book about crime in South Africa.
I refused to administer another questionnaire. I was not paid for my efforts, and was highly sceptical of this ‘ivory tower’ research.
Ironically, years later I joined the same ‘ivory tower’ environment. I have since taught survey research techniques to hundreds of students, yet this first experience has always guided my teaching and use of research methods in my work.
Reflecting while doing
A few years ago I responded to a call for a needs assessment assignment for a community college based in a rural area. The terms of reference required a clear sampling frame, a draft questionnaire and the utilisation of 16 local graduates to administer the survey as a capacity-building component. The community college was built with the financial support of a German benefactor who responded to a local councillor’s request for help with skills development in the areas of basic literacy, farming, construction and handcraft. The funder wanted to ensure that the skills identified were indeed needed by the nearby community.
The new college building was located on the outskirts of a rural town in order to serve both the town and a community scattered over several kilometres of sparsely populated hills. Sixteen local graduates were trained in questionnaire design and interview techniques. The results of the pilot survey revealed that the community had little or no understanding of the concept of a ‘community college’. Men in particular did not think that ‘going back to school’ was a good idea. They balked at the idea that they needed skills training in farming, something they had been doing all their lives. The young graduates also experienced great difficulty engaging the older interviewees, who wanted to know why they were not consulted before the college was built.
The results were shared with the client. Reflecting evaluatively on our experiences, we realised that we needed an alternative approach. ‘Older’ facilitators were used to conduct a number of community focus group sessions. Community members indicated when it was convenient for them to meet, and were then invited to attend a session closest to where they lived. The groups were organised by gender, with between 12 and 20 participants in each group. During each meeting, the facilitators informed the participants of the general purpose of the college, and encouraged each person to comment. They could ask any question related to the development and future of the facilities. They then entered into deep discussion.
The sessions lasted up to five hours. Towards the end of each session, the majority perspective was summarised and presented to the group to confirm that it indeed reflected their collective opinion. Based on the findings, a report was drafted and a summary shared at a public event that included those who attended the focus group meetings and many more.
The community college is now headed by a woman who is the principal trainer of Early Childhood Development (ECD) practitioners associated with and enrolled at a university in the province. The ECD students use the pre-school at the college for their practical training purposes, and the children who come from surrounding areas are subsidised by the university.
The second workshop space is used by women who train other women in traditional handcraft. These crafts are sold on weekends at a market on the college premises, where local small-scale farmers also deliver their produce to be sold. One of the classrooms is used by a church group every Wednesday and Sunday for its services. Through the contributions and subsidies of many partners, the college is now self-sustaining.
Thinking evaluatively, with sensitivity
Evaluation can play a meaningful role in shaping and directing action. However, being tied to a particular methodological approach blinds one to the array of more appropriate options available for a specific context. Knowledge construction has both a methodological and a values component. The methodological component concerns the designs, measures and analyses that we use to create and arrange data. The values component concerns the kinds of things we can know about interventions, programmes and policies, and enables us to question our own assumptions and those held by participants.
Thinking evaluatively about our own values, preconceptions and methodological preferences when dealing with social issues is not only a useful orientation when we do our work – it is essential. It allows us to be constantly aware of the multiple perspectives on knowledge construction that we have to consider if we are to be good evaluators.