Administrators of sites that launched on LH3 prior to January 1, 2013 may notice a significant drop in the number of downloads reported this year compared with 2012. This may be the case even if your overall traffic increased. After analyzing these variations, we believe the decrease is a result of changes in usage reporting methods between 2012 and 2013, as well as a difference in the structure of the reports.

Changes in reporting methods

In 2012, reports were generated from web server traffic logs, as we did not have a method available through Google Analytics at the time to track downloads. In reviewing the 2012 reports, we believe this method resulted in some double-counting of attachment modules by reporting on both the attachment’s unique URL and the resource landing page. We also saw a spike in traffic from search engines and other bots in 2012, as many new LH3 sites came online that year and were re-indexed by those services as a result.

This year, we used Google Analytics, rather than server traffic logs, to provide 2013 usage reports to the community. The 2013 numbers show, on average, increased usage compared with the LH2 usage reports, but decreased usage compared with last year’s LH3 reports.

Changes in report structure

The 2012 reports broke out views and total resource downloads: that is, views of the resource landing pages and downloads of attachments, where applicable. We also included the module type.

The 2013 reports break out views and resource downloads by specific attachment type to allow you to better understand and filter for usage on specific types of resources.  For definitions of each column in this year’s reports, see our January 31 blog post.

For LSC GAR reports, some states have used LawHelp’s “usage” or “views” stat as the “download” metric, while others report only on usage of certain attachment types. If you have questions about how to interpret this year’s report so that it is consistent with how you reported in 2012 or other years, please let us know.

We understand that the changes in reporting methods between 2012 and 2013, and the potential for over-counting in 2012, may impact your own reports to funders and others. If there is any additional language or support we can provide to help make these changes clear to your stakeholders, don’t hesitate to contact us at

We also appreciate the positive feedback folks have shared about the clearer structure of this year’s reports, and anticipate making additional improvements this year to expedite access to this data for the community.