More Product, Less Process: What It Is, Why It Took Off, and When It Works
If you have ever walked into a storage room full of boxes that no one can access, you already understand the problem MPLP was trying to solve. “More Product, Less Process,” published in The American Archivist in 2005 by Mark A. Greene and Dennis Meissner, became one of the most influential and controversial articles in modern archival practice because it named something archivists were living every day: processing could not keep up with acquisitions, and backlogs were becoming the default condition.
At Backlog, we spend a lot of time inside that reality. We come in when the backlog is heavy enough that staff cannot answer basic questions: what do we have, where is it, and can anyone use it? MPLP matters because it offers a way to think about that situation without pretending you can process everything to an ideal standard before anyone gets access.
The problem MPLP put on the table
Greene and Meissner were working with large repositories and described an issue that many archives still face. New acquisitions arrive faster than staff can process them. Boxes accumulate. Collections sit in limbo. Researchers cannot discover or request materials because there is no description, no finding aid, and sometimes not even a usable box list. The backlog grows, and the longer it grows, the harder it becomes to reduce.
They argued that the profession had developed processing habits that exacerbated the problem. The list is familiar: painstaking removal of fasteners, extensive rehousing, very granular arrangement, item-level weeding, and item-level description. None of these actions are inherently wrong. Their point was about scale. These practices made sense in certain contexts, but they became the default even for large modern collections where the payoff was limited and the time cost was enormous.
The core recommendations, in plain language
MPLP is not a manual. It is an argument for a different baseline.
The authors pushed archivists to stop treating meticulous preservation steps as the automatic first move for every collection. They claimed that in climate-controlled storage, some common risks such as rust from fasteners and paper deterioration progress more slowly than many archivists assume. They argued that preservation efforts should focus on overall holdings maintenance rather than micro-interventions at the folder or item level, especially when the consequence of perfection is years of inaccessibility.
They also challenged a common processing instinct: the notion that archivists must describe everything at a granular level before opening collections for use. Their phrase “research is done with a shovel, not tweezers” became well-known for reframing the archivist’s role. Researchers often want broad access to materials and the ability to excavate. The finding aid needs to be accurate, clear, and useful, but it does not need to anticipate every research question through item-level description.
Privacy and sensitive material were another flashpoint. Greene and Meissner argued that privacy concerns were often overstated and that exhaustive review for sensitive content could become a barrier to access. They were not saying privacy does not matter. They argued that the way archivists handled privacy could become a blanket justification for keeping materials closed indefinitely.
Underneath all of these points was the heart of the article: establish a “golden minimum” for processing and description so that access becomes the priority, not an aspiration postponed forever.
The metric that got quoted everywhere
One reason MPLP traveled so far is that it offered numbers. The authors proposed a benchmark of roughly 400 linear feet per year for a full-time manuscripts processor, based on practices they surveyed. That benchmark worked its way into grant conversations and staffing discussions. It became a shorthand for efficiency, and also a source of frustration because the article did not provide a universally satisfying definition of what “processed” meant at that speed.
That tension still shows up today. Metrics can be useful. They can also be weaponized.
Why the article became a fight
MPLP is controversial partly because it landed in an environment where archivists were already being told to do more with less. Some people read it as a rallying cry for reducing barriers to access. Others read it as professional permission to cut corners.
Criticism also focused on the article’s tone and the context behind its conclusions. The language was provocative, and some responses argued that its methodology leaned toward large university repositories, overlooking how processing realities differ across museums, historical societies, corporate archives, and community-based collections. Others felt the article minimized the intellectual labor of arrangement and description, as well as the slow care that protects collections over the long term.
The responses became a literature of their own. Carl Van Ness's "Much Ado about Paper Clips" questioned whether savings from eliminating small preservation tasks would actually translate into meaningful capacity. Jessica Phillips later mounted a strong defense of preservation work, arguing that preservation and access are interdependent rather than opposed. Greene and Meissner continued to respond over the years, clarifying their meaning and pushing back on interpretations they considered inaccurate.
If you have ever watched archivists argue about MPLP, you have seen a proxy debate about identity. Are we craftspeople? Are we service providers? Are we scholars? Are we risk managers? Most of us are some mix, and MPLP poked that nerve.
What MPLP actually argues, once you strip away the noise
MPLP is easiest to apply when you treat it as a way of thinking rather than a rulebook.
It argues for focusing on useful aggregations instead of individual items as the default. It argues for a minimum standard that gets collections discoverable and usable faster. It argues that a large backlog should be more embarrassing than leaving some paper clips in place. That last line is intentionally blunt, but the point is practical. If the choice is between a perfectly processed collection and thousands of feet of hidden material, MPLP says the profession needs to rethink what it is rewarding.
It also recognizes something that gets lost in the caricature. Some repositories were already doing MPLP-like work long before 2005. The article did not invent fast processing. It gave it a name, pushed it into the center of professional conversation, and challenged the assumption that slower always equals better.
Where MPLP is valuable in real workflows
In Backlog projects, MPLP is relevant when the goal is to reduce the backlog without sacrificing the essentials: intellectual control, basic preservation, and clear access decisions.
MPLP works well when materials arrive in decent original folders and are stable enough to be stored safely with minimal intervention. It is useful for large 20th- and 21st-century collections where box-level or series-level description can unlock immediate research value. It is useful for institutional records that align cleanly with retention categories or office functions, where the goal is often access and accountability rather than perfect item-level description.
It can also support approaches like accessioning as processing, where the initial intake work is structured enough that materials do not disappear into a backlog for a decade. It aligns with extensible processing, where you start with a realistic baseline and then deepen work on the collections that prove high-use, high-value, or high-risk. It can also connect to community description models, such as social tagging, where controlled description is supplemented by user-generated language over time.
MPLP is not a good excuse for neglect. It does not replace preservation when preservation is truly needed. It does not make privacy and restriction decisions disappear. It does not mean every collection should be opened fast with minimal review. At Backlog, we have seen enough messy environments to know that some collections demand careful stabilization before access is realistic. Climate control helps, but it is not a magic shield, especially as disasters and infrastructure failures become more common.
There is also a real operational tradeoff. If the description is too thin, the time cost can shift from processing to reference. The work does not vanish; it moves. MPLP is most effective when the minimum description is still genuinely useful. Not vague. Not aspirational. Useful.
A Backlog way to think about MPLP
When we are helping an organization build a processing approach, the question is not “are we MPLP or not.” The question is “what level of processing gives us the most access for the least risk?”
Sometimes that means stabilizing and rehousing because the physical condition poses a risk. Sometimes it means a fast baseline inventory so staff can stop guessing what is in storage. Sometimes it means pushing item-level descriptions into a second phase only for high-use, high-profile, or legally sensitive collections.
MPLP is valuable because it gives archivists permission to state openly that delayed access for perfection is not neutral. It has a cost. It affects researchers, donors, institutional memory, and the archive's credibility.
If you are staring at a backlog and feeling stuck, MPLP is not a magic solution, but it is a useful lens. It helps you define what “good enough” looks like for this collection, in this environment, with the staff and resources you actually have. That is where progress starts.
Want to learn more? Watch our webinar on MPLP here: