| 

3-11-11: Proposals for Data Element Definitions - FY2011


This page has been locked from further commenting.  Thank you for your comments. 

 

FY2011 Endorsed data elements.

 

These data elements were discussed further by the LSWG at their December meeting and a teleconference in February. The four new data elements include the LSWG changes.

 

Comments are invited until COB, Wednesday, March 23. The ballot will go out on Friday, March 25. There will be a 3-week balloting period. All ballots are due by COB, Friday, April 15. Ballots must be signed by the SDC and Chief Officer.

 

 

PDF version of the draft FY2011 PLS Data Element Ballot: Ballot - FY2011PLS Data Elements - draft.pdf

 

datalifediagram.pdf

45 states must vote to certify the ballot. Items must be approved by 2/3 of the vote.

 __________________

Endorsed Additions (4 items):

 

1. Available Upload and Download Speeds at Computers Used by the General Public. (Collected at the outlet level and reported out at the outlet level.)[1]

 

 

Definition:

The available upload and download speed at each outlet as determined by initiating a connectivity speed test through a national public library speed test web portal at an Internet computer used by the general public. Each library outlet will be asked to access the public library speed test portal after the close of business during a time designated by the state library.

 

Process for Testing Upload/download speeds:

IMLS has identified the following process for collection based on interviews with representatives from the Federal Communications Commission and two of the main vendors that provide speed test services, M-Lab and Ookla (the vendors that provide the speed tests for the FCC's Broadband.gov site).

 

To tailor the collection to libraries, IMLS would have to build a collection web site that incorporates the test tools from one of the vendors. The web site would provide clear instructions for speed tests that are comparable to those found on the FCC's sites (http://www.broadband.gov/qualitytest/about/).

 

Collection through the site will require libraries to enter institutional identifiers to link the test to their library. The FSCS ID will be used. In addition to the FSCS ID, the institution name and address will be collected to verify the institutional record should the FSCS ID be missing or invalid. Libraries will also be asked whether or not they are running the test as part of the PLS collection (Y/N) and whether or not the test is being run outside of normal business hours (Y/N). All tests with valid FSCS IDs that are identified as part of the PLS and run outside of business hours will be recorded as part of the PLS collection and will augment a database on the collection server. If libraries conduct more than one test flagged as being for the PLS and outside of normal business hours, the results of all such tests will be averaged for purposes of the PLS data reporting. NOTE: Respondents without an FSCS ID will be allowed to run the speed test, but the data will not be recorded as part of the PLS collection.
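
The paragraph above amounts to a filter-and-average rule. Below is a minimal Python sketch of that logic; the field names, the known-ID set, and the in-memory structures are hypothetical illustrations, not IMLS's actual implementation.

```python
from collections import defaultdict
from statistics import mean

def record_pls_tests(tests, known_fscs_ids):
    """Keep only tests flagged as PLS ('Y') and run outside business hours
    ('Y') with a valid FSCS ID, then average repeat tests per outlet."""
    by_outlet = defaultdict(list)
    for t in tests:
        if (t["fscs_id"] in known_fscs_ids      # valid FSCS ID
                and t["is_pls"] == "Y"          # flagged as a PLS test
                and t["outside_hours"] == "Y"): # run outside business hours
            by_outlet[t["fscs_id"]].append(t)
    # Multiple qualifying tests from one outlet are averaged.
    return {
        fscs_id: {
            "download_mbps": mean(t["download_mbps"] for t in runs),
            "upload_mbps": mean(t["upload_mbps"] for t in runs),
        }
        for fscs_id, runs in by_outlet.items()
    }

# Two qualifying runs for the same outlet are averaged; the test with an
# invalid FSCS ID is allowed to run but is not recorded.
known = {"MO0001-001"}
tests = [
    {"fscs_id": "MO0001-001", "is_pls": "Y", "outside_hours": "Y",
     "download_mbps": 9.8, "upload_mbps": 1.9},
    {"fscs_id": "MO0001-001", "is_pls": "Y", "outside_hours": "Y",
     "download_mbps": 10.2, "upload_mbps": 2.1},
    {"fscs_id": "BAD-ID", "is_pls": "Y", "outside_hours": "Y",
     "download_mbps": 5.0, "upload_mbps": 1.0},
]
print(record_pls_tests(tests, known))
# {'MO0001-001': {'download_mbps': 10.0, 'upload_mbps': 2.0}}
```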

 

The speed test takes approximately 1.5 minutes to complete. The same library staff that respond to the PLS survey will be asked to complete the test. Staff will be asked to run the test the week prior to the end of the library’s PLS reporting cycle. Libraries will further be asked to run the test during the outlet’s closed hours (in the morning or the evening); doing so will allow the test to measure maximum capacity. The results of the test will be displayed within minutes of starting the test, should staff wish to record the information for their own records. For the PLS collection there will be no reason to do so, as the speed test results will be stored on a server hosted by IMLS.

 

Reporting the Data on the PLS Files:

Data collected through the IMLS speed test site will be reported as part of the PLS data file and the PLS report. State-level summary reports will be available prior to the PLS report, provided 70% or more of the public libraries in the state conduct the tests through the site. State Library Agencies have the discretion to request more tests from their libraries than the one PLS test; however, these data will not be recorded as part of the PLS dataset.
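
As a quick illustration of the 70% participation rule above, a minimal sketch (the function name and inputs are hypothetical):

```python
def state_summary_available(libraries_in_state: int, libraries_tested: int) -> bool:
    """A state-level summary report is produced only when 70% or more of
    the state's public libraries ran the test through the site."""
    return libraries_tested / libraries_in_state >= 0.70

print(state_summary_available(150, 105))  # True: 105/150 is exactly 70%
print(state_summary_available(150, 104))  # False: just under the threshold
```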

 


[1] This data element will not be collected for FY2011.  The plan is to collect for FY2012. 

***************

 

How it will appear in WebPLUS:

Not applicable – numeric value automatically reported via the speed test

 

Rationale:

The utility for patrons of a public access computer is greatly affected by the Internet speed available at that computer. The number of public Internet access computers is currently reported in the PLS, but there is no question that describes the quality of the access available to patrons. Collecting the available speed at each outlet yields information about the actual Internet speed available to a user, which is a determining factor in what Internet content they can access, e.g., workforce training or distance learning, and in decisions about whether and how they use the library for their computing purposes.

 

Only when speed is measured at the local level do the data help reveal uneven access to the Internet. We know from past state and national data collection efforts that libraries with multiple outlets have variable broadband available among outlets (based on purchased speed). Testing the available broadband per outlet may reveal even greater variability, because advertised speeds are not always achieved.

 

This information would allow clearer statements about the quality of public access computing in library outlets, as well as whether and how it is changing over time. Availability of this information at the local level will help library leaders engage local providers and policy makers in making the fastest, most cost-effective broadband available. At a national level, the data will provide libraries with comparative information that will allow them to advocate for improved public access Internet connectivity at the system, region, and state levels.

 

The following 12 states collect connection speed at the outlet level: Alaska collects the total bandwidth a library gets for the cost of its subscription and the library’s Internet access per computer; Arizona collects the Maximum Speed of Connection; California collects the fastest Internet connection speed available at the library; Colorado collects the fastest speed of the connection between the library and its Internet Service Provider; Connecticut collects the subscribed-for speed of connection (the speed of connection purchased by or for the library); Idaho collects the theoretical optimum download speed available to the general public; Illinois collects the maximum download speed of a library's Internet connection; Michigan collects Connection Speed by Outlet; Minnesota collects Internet Speed for Public Computers; Missouri collects Connection speed at main and branch libraries; Nebraska collects Maximum download speed of the Internet connection; and New Mexico collects Speed of the connection to the ISP.

 

Considerations: 

Several considerations regarding the connectivity speed data element were taken into account during the course of proposal development.

Data collection burden for libraries and states: In addition to providing a more accurate portrayal of the connectivity available, this approach would reduce the number of data elements collected (when compared to the administrative-level max/min data elements approach) and reduce the reporting and accuracy burden on local libraries by automatically cataloging the results remotely. Collecting data through an automated speed portal will also result in more valid and reliable data than relying on libraries to self-report connectivity speeds.

_______________________________________________________________________________


2. Number of Up-to-Date Internet Computers Used by General Public (actual number of computers; collected at the outlet level and reported out at the system level)

 Question:

How many of your library’s Internet computers used by the general public were first put into use in the past fiscal year? 

 

 

Definition:

This is the total number of public access Internet computers at the library that were first put into use during the past fiscal year, including personal computers (PCs) and laptops. This number includes new computers that were purchased or leased by the library in the past fiscal year, computers that were refurbished during the past fiscal year, and computers that were donated to the library, but only if they were first put into use or refurbished during the past fiscal year. If the provenance of a donated computer is not known, do not include it in this number. If the library uses thin-client or other server-side computing that makes the age of the workstation irrelevant, please enter 9999 in the response box.
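
Because 9999 is a sentinel rather than an actual count, any tabulation of responses needs to set those entries aside. A minimal sketch (the function and variable names are hypothetical, not the WebPLUS schema):

```python
THIN_CLIENT_SENTINEL = 9999  # "age of workstation irrelevant" response

def up_to_date_count(response: int):
    """Return the reported count, or None for the thin-client/server-side
    sentinel, which must be excluded from sums rather than added as 9999."""
    return None if response == THIN_CLIENT_SENTINEL else response

responses = [12, 4, 9999, 7]
counts = [c for c in map(up_to_date_count, responses) if c is not None]
print(sum(counts))  # 23 -- the sentinel response is excluded, not summed
```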

***************

 

How it will appear in WebPLUS:

The question will appear as listed above.  Responses will be recorded as numeric entries.

 

Rationale: 

The number of public Internet access computers is currently reported in the PLS; however, there is no question that describes the quality of the access available to the patrons of a library system. This information would allow clearer statements about the quality of public access computing within a library and whether and how it is changing over time. Unless the library uses thin-client or other server-side computing that makes the age of the workstation irrelevant, the utility for patrons of a public access computer is greatly affected by the age of the computer. The number of computers put into service in the past fiscal year is a proxy for the capacity of the devices to sufficiently run current software and applications typically in use by the public.

 

Availability of this information at the national level will provide libraries with comparative information that will allow them to advocate for improved public access hardware at the system, region and state levels. 

 

(Vermont collects the approximate age of public access computers, i.e., the number of public access workstations newer than 1 yr, 1-2 yrs old, 3-5 yrs old, and older than 5 yrs) 

 

See the definition for PLS item #650, Number of Internet Computers Used by General Public.

 

Considerations: 

Several considerations regarding the content of the up-to-date hardware data elements and response options were taken into account during the course of proposal development.

Providing an age cut-off for “up-to-date” versus asking for quantities of computers of several different ages: To minimize data collection burden, the decision was made to predetermine the cut-off for what should be considered an “up-to-date” computer rather than ask for the number of computers of several different ages, which would allow more flexibility on the analysis end. The decision also reflects a degree of consensus in the field that computers should be replaced every 3-5 years to sustain adequate access to productivity software and online resources.

IMLS staff spoke with John Bertot from UMD to discuss the reliability and validity of a similar question that is reported by libraries for the Gates/ALA survey. He reported that the Public Libraries and the Internet study asks a more detailed question than the one posed above. Despite this, he saw no significant challenge to answering the question. However, his survey is asked at the branch level, and he felt the question would have greater utility at that level of analysis.

 

________________________________________________________________________________   

 

3. Expenditures on Public Access Hardware during the Past Fiscal Year (actual dollar amount; collected and reported at the system level)

Question:

How much did your library spend on replacing and upgrading hardware for use by the general public during the past fiscal year?

 

Definition:

This amount is the cost to the library of replacing and upgrading public access hardware (any hardware used for public access, e.g., servers, PCs, laptops, printers). It includes only expenditures on hardware that came out of the library's budget, or expenditures that came out of monies given to the library that were allotted to pay for hardware at the library’s discretion. It does not include expenditures made by other entities on behalf of the library.

 

How it will appear in WebPLUS:

The question will appear as listed above.  Responses will be recorded as numeric entries.

 

Rationale: 

Total Capital Expenditures and Other Operating Expenditures are currently reported in the PLS, but there is no way of determining how much of those expenditures are made in support of public access technology. Availability of this information at the national level (paired with information about the size of service area populations) will provide libraries with comparative information that will allow them to advocate for improved public access hardware at the system, region and state levels. 

 

Rather than restructuring existing Other Operating Expenditures and Total Capital Expenditures data elements and definitions, this proposal adds new data elements on public access technology expenditures that would be the least challenging for systems to report, i.e., those made for hardware and Internet service for the general public.

 

The State Data Coordinators may want to consider modifying the existing data elements over time to get to the point where the public access technology expenditures are separated from the Other Operating Expenditures and the Total Capital Expenditures. This will take several years and preparation by each of the State Data Coordinators.

 

(The District of Columbia collects expenditures on new computers and the IT Staff payroll that supports public access computers and the number of computers installed each quarter; Indiana collects Public Use Computer Database Licensing, Maintenance and Purchase Fees and total expenditures associated with Public Access Computers; Nebraska collects Electronic Access expenditures, including expenditures associated with access to electronic materials and services and computer hardware and software used to support library operations; New York collects Telecommunications fees, including those for telephone and Internet operation and installation; South Carolina collects expenditures for all furniture and equipment purchased or leased with funds from the recurring operating budget, including both electronic and general equipment furnishings; Washington collects Technology Expenditures, including the costs of computer hardware and software used to support library operations or to link to external networks, including the Internet) 

 

 

IMLS Review/Research after December LSWG Meeting

IMLS staff spoke with John Bertot from UMD to discuss the reliability and validity of a similar question that is reported by libraries for the Gates/ALA survey. Dr. Bertot maintained that the financial information is the most challenging data to collect because not all libraries budget IT services in this manner. However, he also noted that this data element is at a much higher level of aggregation and doesn't have as many breakdowns by funding source and IT category as the ALA/Gates survey, and therefore could enjoy greater reliability.

 

________________________________________________________________________________

 

 

4. Established Replacement Plan for Public Internet Computers (collected and reported at the system level)

 

Question:

Does the library have a written replacement plan for its Internet computers used by the general public? 

 

Definition:

A library has a replacement plan if it has a written and approved plan or policy for replacing computer technology. A library does not have a replacement plan if it replaces computers on an “as needed” basis, according to criteria that have not been formally defined or documented in an approved plan.

 

NOTE: Definition was modified by LSWG in the December meeting.

 

***************

 

How it will appear in WebPLUS:

The question will appear as listed above. Responses will have “Yes” or “No” options in a drop-down list in the WebPLUS application.

Rationale: 

Although a report of having expended $0 on Internet service and public access hardware during a particular fiscal year may reflect budget reductions or a cyclical approach to such expenditures, collecting whether or not a library has a replacement plan will help clarify these instances. Furthermore, independent analysis of data from the Public Library Funding & Technology Access Study by the U.S. Libraries Program at the foundation shows that a library having a replacement plan, regardless of formality and content, is robustly related to concurrent and future measures of a library’s public access quality and investment. Applications for E-rate no longer require technology plans.

(Alaska collects the presence of a technology plan that is approved by the Alaska State Library; Oklahoma collects the presence of a Technology Plan and the dates the plan covers) 

 

Considerations: 

Several considerations regarding the content of technology expenditures data elements and response options were taken into account during the course of proposal development.

Redefining the existing definition of Other Expenditures to exclude technology expenditures and establishing a new Technology Expenditure category versus adding technology expenditure questions with no implications for existing measures: This approach would not have implications for existing survey questions, minimizing the burden of changes to existing survey measures.

Recognizing the value of giving libraries an opportunity to more gradually think about breaking out technology expenditures as its own line item: Although it would be useful for libraries’ technology-related advocacy efforts to be able to state how much they spend on technology, most libraries do not currently track technology expenditures as a single line item in their budgets. Rather than abruptly requiring libraries to do this, this approach makes incremental steps toward such a goal, so that libraries can get practice tracking these expenses separately and recognize on their own the value of doing so.

Stricter versus looser definition of replacement plan: Recognizing that “replacement plan” can mean different things to different libraries, this question qualifies the replacement plan as “established”.  Correlation research conducted by the US Libraries Program at the foundation shows that a loose definition of replacement plan has a statistically significant and robust relationship with an index of quality public access (comprising connection speed, sustained up-to-date hardware and sustained technology funding).

 

________________________________________________________________________________

 

 

Endorsed Change (1 item):

 

1. 450 Print Materials

 

Report ONLY BOOKS IN PRINT:

1. Books in print. Books are non-serial printed publications (including music and maps) that are bound in hard or soft covers, or in loose-leaf format. Include non-serial government documents. Report the number of physical units, including duplicates. For smaller libraries, if volume data are not available, count the number of titles. Books packaged together as a unit (e.g., a 2-volume set) and checked out as a unit are counted as one physical unit. 

2. Serial back files in print. Serials are publications issued in successive parts, usually at regular intervals, that are intended to be continued indefinitely. Serials include periodicals (magazines); newspapers; annuals (reports, yearbooks, etc.); journals, memoirs, proceedings, and transactions of societies; and numbered monographic series. Government documents and reference tools are often issued as serials. Except for the current volume, count unbound serials as a volume when the library has at least half of the issues in a publisher’s volume. Report the number of physical units, including duplicates. For smaller libraries, if volume data are not available, count the number of titles. Serials packaged together as a unit (e.g., a 2-volume serial monograph) and checked out as a unit are counted as one physical unit.

 

 

Reason for change: 

A library director pointed out that the current definition harks back to olden times when libraries bound volumes and used print copies of Reader’s Guide to Periodical Literature to locate individual articles. Most libraries no longer bind periodicals and now rely on databases such as ProQuest to locate and obtain individual articles if the publisher itself does not provide past articles online. Due to space constraints, printed back issues are typically kept for just several years, circulated, and, relative to the pre-online-database era, weeded quickly. The library director contends, and I agree, that it is wasted time to count how many single issues she has, calculate how many amount to more than half of a publisher’s volume, and then add volumes and subtract individual issues in her ILS to come up with an accurate number. “If it has a bar code, it’s a volume,” and the ILS delivers the number pronto.

 

 

Comments (9)

Kim Miller said

at 1:02 am on Mar 22, 2011

It was pointed out to me that the rationale in #2 previously had the words "past four years". This has been updated to match the definition: "past fiscal year".

Katina Jones said

at 2:19 am on Mar 22, 2011

Here is the one question I have:

Upload/Download Speed
Statement in Question:
"Staff would be asked to run the test the week prior to the end of the library’s PLS reporting cycle."

I’m confused by this specification. When I read this, I see that in Missouri I would be asking libraries to run the test the week prior to the end of their fiscal year. So I would have libraries on a calendar-year FY running it the week of Christmas (week 51), those on a July-June FY running it week 26, ones on an October-September FY running it week 39… Or am I to read this that they should all run it on the state’s fiscal year (which for Missouri doesn’t coincide with other July-June states in the PLS)…

Sorry – I need more clarification. My preference would be that all libraries in the state run it the same week.

Carlos Manjarrez said

at 3:41 am on Mar 22, 2011

Katina,
Thank you for your comment. You have homed in on one of the issues that was a bit vexing for us. The "one week prior to the library's reporting cycle" was an attempt on our part to standardize collection in a way that would be easy for individual libraries to remember. The idea was to set a speed test deadline that was one week prior to their deadline for reporting their data to the state (not one week prior to the end of their fiscal year), because we did not think it would be feasible to establish a single speed test date that would work for all states.

It would be good to know from others if this (somewhat arbitrary) request to have people conduct their test one week prior to submitting their data to the state causes problems in other states as well.

Thanks.
-Carlos

Michael Golrick said

at 4:29 am on Mar 22, 2011

Now I have a question related to Carlos' comment:

Is this a question we will need to ask, or is this something that we will ask each library to do prior to the state deadline for reporting? Or, my question could also be phrased as, do the libraries have to complete this test in order to be considered as completing the survey?

I share Katina's concern, since for almost all my libraries the reporting period ends 12/31. That means people will be asked to remember to do the test the week between Christmas and New Year's. On the other hand, let me note that it is the definition that says "during a time designated by the state library," while it is only in the process section where the week before the end of the reporting period is mentioned. Perhaps there should be some consistency.

Oh...and since I am asking a question, let me express a concern. Since this is at the OUTLET level, I have concerns about getting libraries to correctly enter the FSCS ID numbers (consistently).

Carlos Manjarrez said

at 8:23 pm on Mar 31, 2011

Hi Michael,
Sorry about the delayed response. Regarding completion of the test, the collection will not be done through WebPLUS, so there will be no non-response edit check in the first collection. It is difficult to predict what the response rate will be at this time. Because it will take time to develop the collection web site, there will be time for us (SDCs and IMLS) to fine-tune the process of collection, incentives for participation, and the timing of the test.

I am a little less concerned about the FSCS code issue as we could build an FSCS lookup into the collection website to make it easier for outlets to identify themselves.

-Carlos

Katina Jones said

at 6:50 am on Mar 22, 2011

Carlos, thanks for your response. So, to be sure I'm clear, the statement I highlighted could read instead:
" ... run the test the week prior to the end of the state's PLS collection cycle." Is that correct?

Thanks.
Katina

Carlos Manjarrez said

at 8:34 pm on Mar 31, 2011

Hi Katina,
Yes, it is correct (with an asterisk). As I mentioned to Michael, the collection web site for this particular data element will not be in effect for the 2010 collection. I highly recommend that we (SDCs and IMLS) carve some time out of the next conference to fine-tune these collection questions (should the data element pass this spring).
-Carlos

BrucePomerantz said

at 9:33 pm on Mar 23, 2011

Item #3: Public Access Hardware

IN ORDER OF INCREASING OBJECTION:
1) Small libraries probably don't keep line-item distinctions between public and staff expenditures. My experience has been that the financial numbers are prepared by the city clerk or treasurer. A request that they keep the items distinct will go unheeded.

2) John Bertot has informed us that this was the most unreliable data he collected. So, how reliable will the data be if we collect it? Is this going to be a matter of GIGO?

3) Making a distinction between staff and public equipment is akin to making a mind/body distinction. It can be done, but one cannot operate without the other, so why do it? If we want to track the increasing expenditures that libraries spend on hardware, then make it one complete amount, regardless of staff or public usage.

4) As libraries decrease in physical space because materials become increasingly placed in "The Cloud" and available/downloaded upon demand, perhaps we need three new expenditure categories: (1) hardware, (2) software, and (3) licensing.

Laura Stone said

at 6:03 am on Mar 24, 2011

I asked some of our public librarians for comments on the questions (I didn't provide the definitions), and the librarians commented that they don't break out computer expenditures by staff and public. One librarian noted that her library has a four-year replacement cycle, but that can change if the budget is frozen.

You don't have permission to comment on this page.