We all owe an ongoing debt of gratitude to the Independent Sector (IS) for its role in attempting to quantify the giving of money and time in the United States. While it is unfortunate that a subject this broad continues to rely on one periodic study, we were at least beginning to gather comparable data. I was therefore dismayed to discover that IS - without any notice of which I was aware - changed both its data-gathering firm and the parameters of its study this year. The most disappointing part of the change, which is explained on page 4 of the summary report at http://www.independentsector.org (not available as of 2014), is that the data now refer to adults age 21 and older, instead of the previous age 18. Ironically, I encountered exactly this issue a few months ago in Australia and challenged some government officials there on this very point. Even the forward-looking Canadians have limited their data gathering to adults.
How can we continue to proclaim our interest in encouraging youth to volunteer if we do not collect data on whether or not they do? The first study of volunteering in the United States, done in 1970 - and the only one done by the federal government - looked at Americans aged 14 and up. Why could this not have continued? Even age 18 was not old enough to give an accurate picture of youth service. I suspect it's because - rhetoric notwithstanding - these studies care much more about the amount of money given than about volunteering. And 14-, 16-, and even 18-year-olds are not major check writers.
Even beyond an accurate estimate of volunteering by youth, we will never have meaningful data on volunteering in general if it is forever entwined with financial donations. No study can cover these two broad topics well at the same time. In fact, we ought to be looking to a variety of research so that we can begin to understand what people do in their communities now. There is already a wealth of information out there, but we have never tried to gather, disseminate, or analyze it. I have long felt that it is insufficient to ask individuals, "do you volunteer?" We have ways of calculating the amount of money donated to organizations from the financial and tax reports of those organizations, for which the question to individuals of "do you give money?" simply rounds out the picture. Why can't we get some data about volunteering from both the organizations that benefit from the services of volunteers and those that organize volunteering?
- Where are data about the quantity and quality of volunteer involvement in the annual reports of nonprofit organizations? For too long we have allowed accountants to dictate this whole issue, as if the only meaningful way to "account" for volunteers is the dollar value of their service. This is a red herring. It ought to be possible for organizations to provide some basic numbers: how many people volunteered this year? what were their ages, gender, race, etc.? If the argument is that this information is too difficult to collect, we ought to respond that maybe the problem is no one cares. Big Brothers/Big Sisters, Girl Scouts, literacy councils, hospitals, and the vast majority of formal settings surely ought to know how many people they are involving. They count employees. They count clients. Why not count - and report - volunteers? The American Cancer Society has been saying for over twenty years that it has "one million volunteers." As that advertised number has never changed, it makes me wonder whether they have ever actually done a count.
Further, the annual reports of these organizations contain much more than financial statements. There are always essays and photographs produced for the purpose of informing the public and funders about the accomplishments of the organization. Why are volunteers so rarely highlighted - or, when they do appear, tacked on as an afterthought (don't tell me it's alphabetical order - I've heard that one, too)?
It also ought to be possible to develop reporting systems that answer some interesting questions, such as: how many volunteers come in as individuals vs. in groups? what about families volunteering together? how many students come from mandated school programs and do they remain longer than they are required to do so? does the profile of volunteers match the profile of the organization's clients and/or the community it serves?
Financially, we can compare organization to organization in terms of the ways they spend their money, or the amount of excess revenue over expenses. Why shouldn't it be possible to assess and compare the involvement of volunteers? (And why aren't more foundations asking such questions?)
- Where are reports about community service organized by schools? Given the official nature of these settings, and the type of data gathering that is done there routinely on other subjects, why can't we expect to learn what students are doing in the community, how many hours they are doing it, whether or not they stay active beyond minimum requirements, and what impact they may have had?
- Parenthetically, it occurs to me that Independent Sector might at least ask the adults they survey whether young people in the household are involved in volunteering, too.
- All-volunteer associations are equally guilty of a lack of reporting. We have heard a great deal about the reduced level of membership in civic clubs, fraternal groups, and other membership organizations. But what about the members they do have? The real question here is distinguishing between people who join a group and those who actually do volunteer work for and through it! Some honest disclosure, while potentially embarrassing, might actually help these organizations to improve their recruitment of new members. How about some collaboration on a format for reporting such data as: number of new members; average duration of membership; age, gender, and race profiles for "members" and for officers and other truly active participants?
- Why can't youth-involving organizations report on the accomplishments of their participants? It is more than folklore that Boy and Girl Scouts do community service, as do faith-based youth groups, sororities and fraternities, etc. Would it be so very hard to agree on a few basic pieces of information that all these groups would collect and report?
- As long as I've been in the field, I have witnessed the tension between Volunteer Centers and other referral agencies, which want to assess their effectiveness in directing people to volunteer opportunities, and the agencies that ultimately "get" the volunteers and see such involvement as "belonging" to them. Let's get our act together! Any community in the world where there is a volunteer center or similar body, or where there is a DOVIA or other professional network of volunteer program managers, can decide - on its own - to collect and report "The State of Volunteering Here." It simply takes will. It doesn't even take money. But it does take a commitment to volunteerism as a field, without concern for "ownership" of information. With computers, listservs, data-gathering Web capability, and other electronic tools, this is no longer a major chore. The cumulative information that we ourselves are sitting on is incredibly important.
By the way, we do not even have an accurate count on how many directors of volunteer services are out there. We only have estimates. Because our job titles vary so enormously and because so many of our colleagues wear multiple hats, we are even harder to count than volunteers! But shouldn't we occasionally try? Now there's a way to celebrate International VPM Day!
All of this, of course, speaks purely to counting heads. I have not even mentioned a tally of the number of hours, largely because this has always seemed relatively meaningless to me as a measure of quality of service. In fact, I have routinely challenged the interpretation of previous IS studies in which people bemoan the discovery that the number of volunteers has gone up but the average number of hours served has gone down. Perhaps this is a wonderful piece of information. Maybe it means that volunteers have grown so effective in their service (or are so well managed by staff who understand volunteer administration principles) that they can accomplish more in less time!
This topic is - as so many are - critically connected to a number of other issues. For example, the debate about vocabulary is front and center here. If we are going to count volunteers, whom do we mean? Do we count board members? Stipended AmeriCorps participants? Student "interns"? But I'm at the point of not caring. In the absence of any study of value, any new contribution is a start in the right direction. If we begin in our backyards, we can eventually cultivate the entire field.