The Everyday Experience of Research

“The social sciences need to conquer their academic inferiority complex.”  So say two professors of political science at the University of Rochester in New York.  Clarke and Primo’s term for this inferiority complex is “physics envy.”  I like it.

“Physics envy” captures in a nutshell the lingering sense among interpretive researchers of the need to somehow be more like “real” research in order to meet the expectations of the scholarly world.  Clarke and Primo say that those of us doing qualitative research strive to be, as much as possible, like what is perceived as the “real” thing, often using the language and design elements of the “hard” sciences such as physics and chemistry in our research.  Their claim is that we try to represent ourselves as part of the hypothetico-deductive method.

I am an interpretive researcher, and I have been guilty of physics envy.  In at least two of my past research studies, I used randomization to create the study samples:  1) a national sample of 250 occupational therapists in the United States, generated from the AOTA membership records; and 2) a sample of day care centers, gleaned from the member list for the Wisconsin Day Care Association.  The first AOTA sample was for an ethnographic study I did with Virginia Dickie on the meaning of doing occupational therapy and the second sample was for a statewide phenomenological study of day care staff experiences of working with older people with dementia.  Clearly, these were hypothetico-deductive methods of sample selection inserted into two interpretive studies.  Rightly or wrongly, we viewed this use of randomization as simply a convenient way to get a sample, yet on some level I think I also felt like this method added to the “robustness” of the studies.  And sure enough, in the dementia day care study, one reviewer of my funding proposal cited the “random sample” as a strength.

We can also turn this example of physics envy inside out.  As noted above, physics envy may show itself in the insertion of specific scientific methods into an interpretive research study.  But I suggest that a yearning to be seen as doing only “real” research is also present when we consciously ignore any and all experiential aspects of a hypothetico-deductive research study.  This approach to research and research reporting is demonstrated when all the experiential components of a scientific research study are carefully eliminated, or as Clarke and Primo say, when “everything messy and chaotic about scientific inquiry [is] safely ignored.”  This I have also done.  My master’s thesis provides a good example.

My master’s thesis was a study of aging and postural sway in women.  Collection of data (area of postural sway in two age groups of women in two stance positions) depended on successfully taking synchronized photographs of scale readings:  1 per second, 18-second trials, 6 trials per subject, 20 subjects = 2160 center of gravity readings total.  From all these center of gravity points, I was able to calculate the areas of postural sway for each subject and statistically compare areas of postural sway between the age groups and the two stances.  
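The tally described above can be checked with a few lines of Python (a trivial sketch of the arithmetic only; the variable names are mine, not from the study):

```python
# Count of center-of-gravity readings in the postural sway study,
# using the figures given in the text.
photos_per_second = 1
trial_seconds = 18
trials_per_subject = 6
subjects = 20

readings_per_trial = photos_per_second * trial_seconds   # 18 readings per trial
total_readings = readings_per_trial * trials_per_subject * subjects
print(total_readings)  # 18 * 6 * 20 = 2160
```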

I published the study in The Journal of Gerontology.  In the section describing the data collection and reduction, I stated the following:  “Synchronized biplane photographs of the scales, taken at specific intervals, enabled the investigators to determine the vertical projections of the center of gravity on the base of support over time.  Scale readings were obtained from the negatives by using a Model P-40 Recordak.”  This dry, factual statement stripped the complicated process of photographing, developing, and reading hundreds of negatives of all human experiential aspects.  In reality, the entire process of film development and reading was filled with intense anxiety.

For example:

1. I received instruction from a research assistant on the process for developing the film so that I could work independently.  I found the hours spent alone in the darkroom — unloading the camera, coiling the film carefully into tubs of different kinds of solutions and being careful not to twist the film so as to enable the solution to bathe all parts and surfaces of the film evenly — were nothing short of terrifying.  To ruin one of the seemingly endless strips of film during the developing stage would have been a huge setback.  I remember thinking I am never doing this kind of research again.

2.  I relied on a lab assistant to load, focus and run the camera — a factor that was certainly helpful in one way, but nearly fatal in another.  I realized later that I should have taken it upon myself to know more about the camera and the synchronized photo process because, as I spent hours and hours reading the dial scale numbers on the 2160 negatives, the numbers gradually became more and more blurry, eventually becoming unreadable.  The camera had slowly lost focus.  I was saved by the fact that the hands on the dials were still visible and I could switch to using a protractor-type device to determine the readings.  But until this solution evolved, I was, once again, pretty much terrified.  Now, in looking back to the statement in the published account (“Scale readings were obtained . . . “), I can see that I definitely stripped the process of all human experiential aspects.

I am not suggesting that any or all experiential details should be included in a scholarly write-up for publication.  But neither should they be “safely ignored.”   Experiential aspects are important for us to pay attention to in our own research — to be open to, to learn from, to reflect on, to share with colleagues, and to consider seriously as we develop and plan our research careers and assist graduate students with theirs.

Experiential aspects of research in many ways represent the everyday-ness of research.  After all, to carry out research is to experience the everyday occupation that it offers.

It seems desirable for all of us to reflect on Clarke and Primo’s concept of physics envy, whether our approach is interpretive or hypothetico-deductive.  Whether or not the examples above actually represent physics envy is probably debatable.  Whether or not those of us who are researchers in the social sciences actually possess physics envy is also likely debatable.  The ideas seem worthy of thought.

Clarke, K.A., & Primo, D.M. (2012, April 1).  Overcoming ‘physics envy’.  The New York Times.

Hasselkus, B.R. (1997).  Everyday ethics in dementia care:  Narratives of crossing the line.  The Gerontologist, 37, 640-649.

Hasselkus, B.R., & Dickie, V.A. (1994).  Doing occupational therapy:  Dimensions of satisfaction and dissatisfaction.  American Journal of Occupational Therapy, 48, 145-154.

Hasselkus, B.R., & Shambes, G.M. (1975).  Aging and postural sway in women.  Journal of Gerontology, 30, 661-667.

2 thoughts on “The Everyday Experience of Research”

  1. Your observations regarding the need to share the “everyday-ness” of the research process seem very apt, particularly with respect to the expectations of students who are beginning to understand and experience research. They cannot possibly envision the unknown complications that may arise, yet hearing of others’ challenges may facilitate greater understanding of what the research process entails, including barriers, facilitators and delays. Despite these unknown challenges, researchers’ sharing their lived experience provides a long view of the process that is unfamiliar to the novice investigator. Furthermore, regardless of what methodology is utilized (e.g. quantitative, qualitative, mixed methods, etc.), by including commentary on the research experience, the entire investigative process can be better synthesized.


  2. As a social and qualitatively oriented researcher, I agree the term “physics envy” sheds some light on some of our insecurities (and aspirations). There is probably a part of all of us that covets causation and strives for certainty to the degree we perceive it over on that side of the fence where the grass looks so much greener. I’d like to be involved in research that neatly and categorically determines principles that can be generalized to a whole population.

    However, on this side of the fence, I also aspire to research embedded in the reality of people’s lives and day to day experiences. Research that connects and engages with the everyday-ness of their life, their health, their disability, their activities, their participation, their environments and the services that impact on them. I also know that each person is unique and we rarely fit into neat categories.

    While I’m far from an expert on hypothetico-deductive scientific methods, I harbor lingering suspicions about whether those “real” methods will actually tell me much about “reality” or everyday-ness. But on the other hand, I’m still left dissatisfied with idiographic approaches that can only vaguely suggest how meaningful a concept is to one person.

    So I try to straddle the fence. If there is an opportunity to explore qualitative experiences, but to do so across a large random sample (1), I’ll have a go. I’m keen to try and grasp people’s everyday realities, but I also want to be able to draw conclusions that have relevance beyond one or two. I am also happy to count the frequency with which themes occur in a thematic analysis (2) to understand not just the strength of what has been said, but how representative those concepts are across a whole data set. I recognize that such approaches have considerable limitations and leave purists shaking their heads, but they also provide some useful findings.

    A challenge I see is developing (or co-opting) methods which are qualitatively and quantitatively convincing and meaningful. The fence can be very uncomfortable. Some methodological supports and shared experiences may help.

    (1) Kuipers P, Kendall M, Amsters D, Pershouse K, Schuurs S. Descriptions of community by people with spinal cord injuries: Concepts to inform community integration and community rehabilitation. Internat J Rehabil Res 2011 Jun;34(2):167-74.
    (2) Kuipers P, Wirz S, Hartley S. Systematic synthesis of community-based rehabilitation (CBR) project evaluation reports for evidence-based policy: a proof-of-concept study. BMC Int Health Hum Rights. 2008;8:3.

