United States Postal Service®
Quarterly Performance for Standard Mail®
For Standard Mail® letters and non-Saturation flats, the Postal Service’s service performance measurement system uses documented arrival time at a designated postal facility to start the clock, and an Intelligent Mail® barcode (IMb™) scan by an external, third-party reporter to stop the clock. Mailpiece tracking from IMb™ in-process scans is used in conjunction with the external data to extrapolate results to the entire volume of Full Service Intelligent Mail® Standard Mail. Data collected by the Postal Service are provided to an independent, external contractor, which calculates service measurement and compiles the necessary reports. The system used for this reporting is the Intelligent Mail® Accuracy and Performance System (iMAPS).
The external contractor determines service performance from the elapsed time between the start-the-clock event recorded by the Postal Service and the stop-the-clock event recorded by anonymous households and small businesses that report delivery information directly to the contractor. The service measure consists of two parts: (1) how long mailpieces take to move through processing, and (2) how long mail takes from the last processing scan to delivery. The second part serves as a delivery factor differential that estimates the percentage of all Standard Mail® delivered on the last processing date versus the percentage delivered afterward. Service performance is then measured by comparing the transit time to the applicable service standard to determine the percent of mail delivered on time.
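The on-time comparison described above can be sketched as follows. The record fields, dates, and service standards are illustrative assumptions; the report does not publish the iMAPS data schema.

```python
from datetime import datetime

# Hypothetical piece records: start-the-clock arrival, stop-the-clock
# delivery report, and the applicable service standard in days.
# All field names and values are invented for illustration.
pieces = [
    {"start": datetime(2012, 1, 9, 8, 0),  "stop": datetime(2012, 1, 12, 17, 0), "standard_days": 3},
    {"start": datetime(2012, 1, 9, 8, 0),  "stop": datetime(2012, 1, 14, 17, 0), "standard_days": 3},
    {"start": datetime(2012, 1, 10, 8, 0), "stop": datetime(2012, 1, 13, 17, 0), "standard_days": 5},
]

def percent_on_time(pieces, extra_days=0):
    """Share of pieces whose transit time is within the service
    standard plus an optional grace window (e.g. +3 days)."""
    on_time = sum(
        1 for p in pieces
        if (p["stop"].date() - p["start"].date()).days <= p["standard_days"] + extra_days
    )
    return 100.0 * on_time / len(pieces)

print(round(percent_on_time(pieces), 1))       # → 66.7 (on time)
print(percent_on_time(pieces, extra_days=3))   # → 100.0 (within standard +3 days)
```

The same function covers both the headline on-time figure and the "within service standard plus N days" measures reported below, by varying `extra_days`.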
The Service Performance Measurement (SPM) application of the Full Service Seamless Acceptance and Service Performance system (SASP) serves as the data source for iMAPS. SPM captures data from all Full Service Intelligent Mail® and applies business rules for service measurement before sending data to iMAPS.
In November 2010, the Postal Service™ established a new certification process for all commercial mailers. Only pieces tendered by mailers certified as compliant and accurate were included in service performance measurement in FY11 Q1 through Q3. In FY11 Q4 the explicit certification of mailers was replaced by system changes that automated much of the certification process so that all Full-Service mail could be evaluated for compliance. All pieces that met service performance business rules were included in measurement beginning in FY11 Q4.
The service performance measure for DDU-entry Saturation flats involves the identification of major weekly Saturation mailings within delivery units. Delivery of these mailings is captured with a scan made by carriers upon completing delivery of all pieces on the route. Service performance is measured by comparing the delivery date to the end date of the mailer-requested in-home window to determine the percent delivered on time. Data from anonymous households reporting receipt of these Saturation mailings are used to validate the accuracy of the carrier scans.
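The in-home-window check for Saturation flats reduces to a simple date comparison, sketched below. The record fields and dates are assumptions for illustration, not the Postal Service's actual data layout.

```python
from datetime import date

# Illustrative Saturation mailing records: the carrier's
# completion-of-route scan date and the end of the mailer-requested
# in-home window. Field names and dates are invented.
mailings = [
    {"delivered": date(2012, 3, 7), "window_end": date(2012, 3, 8)},
    {"delivered": date(2012, 3, 9), "window_end": date(2012, 3, 8)},
]

def saturation_on_time(mailings):
    """A mailing counts as on time when the carrier scan date falls
    on or before the end of the requested in-home window."""
    hits = sum(1 for m in mailings if m["delivered"] <= m["window_end"])
    return 100.0 * hits / len(mailings)

print(saturation_on_time(mailings))  # → 50.0
```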
The service performance measure for Standard Mail® parcels with Delivery Confirmation™ is planned to serve as a proxy for measuring service performance for Standard Mail® parcels.
The following service performance results combine the results for letter and flats performance calculated through the iMAPS system with the proxy data to represent service performance for all Standard Mail®.
Data for FY12 Q2 were limited to mailers passing service performance business rules.
Because Standard Mail® flats receive limited automated processing, these service performance results are not representative of all Standard Mail® flats. While DDU-entry Saturation Mail has been included this quarter, significant coverage gaps remain for non-Saturation Destination Delivery Unit entry and carrier route presort volumes of Standard Mail® flats: these pieces are often not processed on automated equipment and are therefore excluded from measurement under current business rules and system capabilities.
In Quarter 2, results for Standard Mail® parcels, which represent less than one percent of total Standard Mail®, are not included in the overall Standard Mail® results.
Nationally, Destination Entry mail achieved performance of 84.3 percent on time in Q2, with 98.6 percent delivered within the service standard plus three days. The Portland Performance Cluster continued to lead the nation in Destination Entry performance with 96.2 percent on time. End-To-End national performance was 56.3 percent on time, with 87.9 percent delivered within the service standard plus three days. The Salt Lake City Performance Cluster continued to have the highest End-To-End score with 78.6 percent on time. Service performance results for the same period last year, FY11 Quarter 2, are not available for Standard Mail® because no mailers were certified at that time.
| Area / Performance Cluster | Destination Entry: Percent Within +1 Day | Destination Entry: Percent Within +2 Days | Destination Entry: Percent Within +3 Days | End-To-End: Percent Within +1 Day | End-To-End: Percent Within +2 Days | End-To-End: Percent Within +3 Days |
|---|---|---|---|---|---|---|
| Capital Metro Area | 91.8 | 95.3 | 97.0 | 64.9 | 76.1 | 84.0 |
| Greater South Carolina | 97.4 | 98.8 | 99.4 | 79.4 | 87.5 | 92.3 |
| Western New York | 93.5 | 96.8 | 98.2 | 67.7 | 79.5 | 87.5 |
| Great Lakes Area | 94.3 | 97.6 | 98.8 | 67.1 | 78.8 | 86.8 |
| Northern New England | 92.5 | 97.0 | 98.6 | 67.2 | 79.4 | 87.3 |
| Northern New Jersey | 95.7 | 98.3 | 99.1 | 67.8 | 79.7 | 88.7 |
| Salt Lake City | 97.4 | 98.9 | 99.3 | 87.1 | 92.3 | 95.4 |
| Nation FY2012 Q2 | 94.2 | 97.4 | 98.6 | 70.9 | 81.2 | 87.9 |
| Nation FY2011 Q2 (SPLY) | - | - | - | - | - | - |
| Nation FY2009 Annual | 93.4 | 96.4 | 98.0 | 78.1 | 85.1 | 90.0 |
| Nation FY2010 Annual | 92.3 | 96.0 | 97.8 | 68.8 | 75.8 | 80.7 |
| Nation FY2011 Annual | 86.5 | 93.2 | 96.2 | 53.9 | 67.1 | 77.1 |
| Nation FY2012 Q1 | 82.9 | 90.4 | 94.2 | 50.7 | 63.2 | 72.9 |
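The "Percent Within +N-Days" columns in the table are cumulative: each column adds the share of mail delivered exactly one more day past the standard to the previous column's share. A minimal sketch with invented piece counts (not USPS data):

```python
# Hypothetical distribution of days delivered past the service
# standard (0 = on time). Counts are invented for illustration.
days_late_counts = {0: 840, 1: 100, 2: 30, 3: 15, 4: 15}
total = sum(days_late_counts.values())

# Build the cumulative percent-within-N-days figures, as in the table.
cumulative = {}
running = 0
for late in range(0, 4):
    running += days_late_counts.get(late, 0)
    cumulative[late] = round(100.0 * running / total, 1)

print(cumulative)  # → {0: 84.0, 1: 94.0, 2: 97.0, 3: 98.5}
```

This is why each column in a row is always at least as large as the one before it: the +3-day figure includes every piece already counted at +1 and +2 days.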