Supplier Scorecard

von Blitz Design’s first major project, the Supplier Quality Performance Summary (a.k.a. Supplier Scorecard), was custom built in the Plex Manufacturing Cloud using the platform’s native SQL Development Environment (SDE) and VisionPlex (VP) user interface rendering engine.

It reports 49 distinct quality and delivery metrics via as many as 12 panels.

Picture
Supplier Quality Performance Summary, with its panels labeled:
  • Application Upgrade Alert
  • Filter Panel
  • Supplier Information
  • Quality Performance Summary
  • Quality MNCR Summary
  • Quality PPM Chart
  • Quality Index Points Chart
  • Quality Ranking Chart
  • Quality Performance Pareto
  • Delivery Performance Summary
  • Delivery Incidents

SQPS Overview


Some data is gathered by queries of receiving and supplier returns; most is derived from Problem Control system data logged using various custom problem forms such as the “MNCR” (Material Non Conformance Report), Acknowledgement of Blueprint, and Delivery Incident Report.

No monthly bulk upload of information is required, though such an approach for displaying third-party data could potentially be incorporated. Queries that drive each panel run in parallel, with data rendered as each panel’s execution completes.

Analysis is delivered via line and Pareto charts, with extensive use of conditionally formatted background coloration (heat mapping) for at-a-glance evaluation of important conditions, trends, and action items.

A standard filter panel appears at the top of the screen. In the example below, the screen has been filtered to all raw material suppliers using the Supplier Code multi-select picker, allowing direct comparison of any number of vendors across any range of historic periods spanning up to 12 months.

The first data grid is the Supplier Information panel, which shows basic reference information for the filtered suppliers. Of particular note is the “IATF Certification” column.
Picture
Filter panel and Supplier Information grid

If the user is a vendor viewing this screen via the supplier portal, Supplier Code is read-only and locked to that user’s supplier. Portal users cannot see or check the “Internal SQM” filter, which toggles visibility of various supplier details and other information not flagged as “Visible to Supplier”.

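As a rough illustration only, here is how such restrictions could be expressed in a panel’s data source. This is a minimal sketch with hypothetical parameter, table, and column names, not the actual Plex schema:

    -- Hypothetical sketch; all object and parameter names are illustrative.
    -- In the real screen these values come from the session and the filter panel.
    DECLARE @Portal_User BIT = 1, @Portal_Supplier_No INT = 12345, @Internal_SQM BIT = 0;

    SELECT s.Supplier_Code, s.Supplier_Name, s.IATF_Certification
    FROM Supplier_Info s                                               -- hypothetical table
    WHERE (@Portal_User = 0 OR s.Supplier_No = @Portal_Supplier_No)    -- portal users are locked to their own supplier
      AND (@Internal_SQM = 1 OR s.Visible_To_Supplier = 1);            -- internal-only details stay hidden
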
Filters such as “Hide Quality Metrics” and “Hide Delivery Metrics” are provided because some personnel focus on only one of these two aspects of supplier performance; these filters let them easily tailor the screen to their needs.

Deselected choices are not merely hidden; the queries behind them are never executed, thus conserving computational resources.
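
As a minimal sketch of the idea (hypothetical names, not the actual implementation), a panel’s query can simply bail out when its content is deselected:

    -- Hypothetical sketch: @Hide_Quality_Metrics reflects the "Hide Quality Metrics" checkbox.
    -- When it is checked, the panel's query exits immediately and the aggregation never runs.
    DECLARE @Hide_Quality_Metrics BIT = 1;   -- supplied by the filter panel in practice
    IF @Hide_Quality_Metrics = 1
        RETURN;

    SELECT Supplier_Code, SUM(Index_Points) AS Index_Points   -- illustrative columns
    FROM Supplier_Quality_Scores                               -- hypothetical table
    GROUP BY Supplier_Code;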

The Supplier Quality Performance Summary grid is the heart of the report.

Displayed here are all the quality metrics, with monthly and cumulative scores for each. The crosstab display of Period columns is dynamic, based on the “Period Range Option” and “Period Range” filter values, which provide ranges of 1, 3, 6, 9, and 12 months, plus Calendar Year and Fiscal Year. Any desired custom “End Period” can be specified.
Picture
Supplier Performance Summary grid
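
A minimal sketch of how the period window might be derived from these filter values (hypothetical names and logic; the real code also accounts for the fiscal calendar):

    -- Hypothetical sketch: derive the first crosstab period from the filter selections.
    DECLARE @Period_Range_Option VARCHAR(20) = '6 Months';    -- "Period Range Option" filter
    DECLARE @End_Period DATE = '2021-12-01';                  -- custom "End Period" filter
    DECLARE @Start_Period DATE =
        CASE @Period_Range_Option
            WHEN '1 Month'       THEN @End_Period
            WHEN '3 Months'      THEN DATEADD(MONTH, -2,  @End_Period)
            WHEN '6 Months'      THEN DATEADD(MONTH, -5,  @End_Period)
            WHEN '9 Months'      THEN DATEADD(MONTH, -8,  @End_Period)
            WHEN '12 Months'     THEN DATEADD(MONTH, -11, @End_Period)
            WHEN 'Calendar Year' THEN DATEFROMPARTS(YEAR(@End_Period), 1, 1)
            -- 'Fiscal Year' would use the tenant's fiscal calendar start
        END;
    -- Each month from @Start_Period through @End_Period becomes one Period column in the crosstab.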

It’s important to be aware that virtually all of these metrics (except “Parts Received” and “Supplier Count”) are “demerits”, so low scores are best and superior suppliers will have many empty cells. Cells are left empty rather than displaying “0” – an ergonomic design choice that minimizes clutter and eyestrain and makes it easier to identify who earned demerits, and when.

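In SQL terms, suppressing the zeros can be as simple as this (a sketch with hypothetical names):

    -- Hypothetical sketch: NULLIF turns a zero demerit total into NULL,
    -- which renders as an empty cell rather than "0" in the grid.
    SELECT Supplier_Code,
           NULLIF(SUM(Demerit_Points), 0) AS Demerit_Points
    FROM Supplier_Quality_Scores             -- hypothetical table
    GROUP BY Supplier_Code;
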
“Ranking” is calculated by comparing index points accrued by all suppliers under consideration; color shading is based on the supplier’s ranking as a percentile among that “pool” of suppliers.

When the report is filtered on a single supplier, the pool is all suppliers having the same Supplier Type as the filtered supplier. If multiple suppliers are filtered, all are shown and their performance on each metric can be directly compared; in this case, ranking is calculated among only the suppliers selected in the Supplier Code multi-picker.
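
A minimal sketch of the ranking calculation, assuming a hypothetical table of index points already limited to the relevant pool:

    -- Hypothetical sketch: rank suppliers by cumulative index points (fewest demerits = rank 1)
    -- and compute the percentile used to drive the color shading.
    SELECT Supplier_Code,
           SUM(Index_Points)                                AS Total_Index_Points,
           RANK()         OVER (ORDER BY SUM(Index_Points)) AS Ranking,
           PERCENT_RANK() OVER (ORDER BY SUM(Index_Points)) AS Ranking_Percentile
    FROM Supplier_Quality_Scores              -- hypothetical table, pre-filtered to the pool
    GROUP BY Supplier_Code;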

Picture
Supplier Performance Summary grid (multiple suppliers displayed)

​If the user prefers, scores for each supplier can be displayed contiguously by supplier instead of by metric. Using the standard UX column-based sorting feature, simply click the Supplier Code column header: 
Picture
Supplier Performance Summary grid (sorted by Supplier Code)

The next grid, Supplier Quality MNCR Summary, is a detailed log of problem records related to the filtered supplier(s) during the filtered period(s). Points are assessed based on problem severity and responsiveness. 
Picture
Supplier Quality MNCR (Material Non Conformance Report) Summary
The multi-select “Problem Form” picker filter and checkbox filters “Show Open Problems Only” and “Show Scored Problems Only” help the user focus on problems in need of attention, since many problem records don’t generate demerit points or will have already been resolved.
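
These filters typically translate into simple predicates in the detail query; a sketch with hypothetical names:

    -- Hypothetical sketch: the checkbox filters narrow the MNCR detail grid.
    DECLARE @Show_Open_Problems_Only BIT = 1, @Show_Scored_Problems_Only BIT = 0;   -- from the filter panel
    SELECT p.Problem_No, p.Problem_Form, p.Problem_Status, p.Demerit_Points
    FROM Problem_Control p                                              -- hypothetical table
    WHERE (@Show_Open_Problems_Only   = 0 OR p.Problem_Status = 'Open')
      AND (@Show_Scored_Problems_Only = 0 OR p.Demerit_Points > 0);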

​The markup below illustrates how problem severity and response scores contribute to various quality metrics reported in the Supplier Quality Performance Summary grid. 
Picture
Quality Performance Summary / MNCR Summary Correspondence
Note that the Late Response Time scores of 320 and 660 in periods 2021-03 and 2021-06, respectively, appear in the performance summary grid but are not displayed in the MNCR summary.

This is because filters applied to limit the presentation of problem records do not exclude those records’ demerit points from the summary; they still count toward scoring.
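
Put differently, the scoring aggregation and the detail listing are separate queries: only the detail grid applies the display checkboxes, while the summary totals every scored problem record in the period (sketch with hypothetical names):

    -- Hypothetical sketch: the summary totals ignore the display-only checkboxes,
    -- so demerit points from hidden problem records still count toward the score.
    SELECT Supplier_Code,
           Period,
           SUM(Demerit_Points) AS Late_Response_Points
    FROM Problem_Control                       -- hypothetical table; note: no checkbox predicates here
    WHERE Metric = 'Late Response Time'        -- illustrative metric selector
    GROUP BY Supplier_Code, Period;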

Some problem types are for internal reference only; unchecking the “Internal SQM” filter hides them. These problem forms are defined as “Not Visible to Supplier” and are never displayed to portal users.

Appropriate use of filters can significantly improve screen performance.


The screen next displays a series of 3 Quality Charts:
  • Supplier Quality PPM
  • Supplier Quality Index Points
  • Supplier Quality Ranking 

Picture
Quality Charts
Again, keep in mind scores are demerits and therefore downward trends indicate improvement. An ideal supplier will have flat lines at 0 for PPM (Parts Per Million rejected) and Index Points, and will have monthly and cumulative rankings of 1.
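
For reference, PPM is the standard parts-per-million-rejected calculation; a sketch with hypothetical names:

    -- Hypothetical sketch: PPM = rejected quantity per million parts received.
    SELECT Supplier_Code,
           Period,
           CASE WHEN SUM(Received_Qty) > 0
                THEN 1000000.0 * SUM(Rejected_Qty) / SUM(Received_Qty)
                ELSE NULL
           END AS PPM
    FROM Supplier_Receipts                     -- hypothetical table
    GROUP BY Supplier_Code, Period;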

Note that a particular supplier’s ranking reflects not only that vendor’s own performance but also how well other suppliers in the pool perform over the same period range.

If multiple suppliers are filtered, the PPM and Index Point charts display aggregated data for the selected suppliers, and the Ranking chart is automatically hidden.

Charts automatically scale based on data.

Note the yellow, orange, and red color thresholds displayed on the PPM and Index Points charts. If no charted values exceed the “Yellow Threshold”, the Orange and Red Thresholds are not displayed; if no charted values exceed the “Orange Threshold”, the Red Threshold is not displayed.

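Selecting which threshold lines to draw can be as simple as comparing the charted maximum against each configured threshold; a minimal sketch with hypothetical names:

    -- Hypothetical sketch: decide which threshold lines the chart should display.
    -- Per the rule above, only the Orange and Red threshold lines are conditional.
    DECLARE @Yellow_Threshold DECIMAL(10,2) = 500, @Orange_Threshold DECIMAL(10,2) = 1500;  -- from metric setup
    SELECT CASE WHEN MAX(PPM) > @Yellow_Threshold THEN 1 ELSE 0 END AS Show_Orange_Threshold,
           CASE WHEN MAX(PPM) > @Orange_Threshold THEN 1 ELSE 0 END AS Show_Red_Threshold
    FROM Charted_PPM_Values;                   -- hypothetical set of plotted values
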
Picture
Quality Charts - Selective Display of Color Thresholds
In some cases, the displayed numeric values of plotted points can obscure each other. A standard feature of the chart object mitigates that problem: hovering over a particular legend entry highlights that series by fading all the others.
Picture
Chart legend hover feature

The Supplier Quality Performance Comparison provides Pareto analysis of the supplier pool based on the final cumulative index point score.
Picture
Supplier Quality Performance Comparison - Pareto Analysis

​The Supplier Delivery Performance Summary is comparable to the Quality Performance Summary, but presents data logged using the Delivery Incident Report custom problem form.
Picture
Supplier Delivery Performance Summary
Picture
Excerpt from the Delivery Incident Report custom problem form
Naturally there is also a Supplier Delivery Incidents grid panel with details of each problem record. The image below shows the correspondence between Delivery Incident Report data and the Summary grid (please disregard the effects of test problem records whose dates ended up out of order of entry):
Picture
Supplier Delivery Incidents / Summary grids correspondence

​Additional features:


User-Specified Configuration
PCN (tenant)-specific metric descriptions, default sort orders, visibility, and value-driven cell background colors are user-controlled via the standard Plex Supplier Scorecard Metric Setup application:
Picture
Supplier Scorecard Metrics Setup excerpt
Range color data configured here is also used to define the Y-axis position of thresholds shown on the PPM and Index Points charts.
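
A sketch of how that reuse might look, reading the threshold positions from a hypothetical color-range configuration table:

    -- Hypothetical sketch: the configured color ranges double as chart threshold positions.
    SELECT Metric_Key,
           MIN(CASE WHEN Range_Color = 'Yellow' THEN Range_Start END) AS Yellow_Threshold,
           MIN(CASE WHEN Range_Color = 'Orange' THEN Range_Start END) AS Orange_Threshold,
           MIN(CASE WHEN Range_Color = 'Red'    THEN Range_Start END) AS Red_Threshold
    FROM Scorecard_Metric_Color_Setup          -- hypothetical configuration table
    GROUP BY Metric_Key;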

Application Upgrade Alert
This panel is normally hidden but can be used by the screen’s developer to advise users of work in progress or recent updates. When activated, it appears at the very top of the page, above the filter panel.
Picture
Application Upgrade Alert

​Application Release Notes
The “Show Release Notes” checkbox filter toggles the display of information that is normally hidden but can be of great value to a user seeking a better understanding of what the screen is showing. Application Release Notes is a dedicated panel showing the change log written by the screen developer:
Picture
Application Release Notes grid

Metric Comments
“Show Release Notes” also toggles the visibility of a column at the far right of each Performance Summary grid. This information is populated by the screen developer and precisely describes the technical method by which each metric is calculated, thus helping to answer the question, “Where do these numbers come from?”
Picture
Metric Comments

Development History

This dashboard was developed for FCC (Indiana) LLC, a Tier 1 supplier of automotive clutch components and assemblies and a long-time subscriber to the Plex Manufacturing Cloud.

I'll say nothing but good things about FCC, which has been a dream customer. Michael Hunt and Diane Parr are a joy to work with, and I’m fairly confident the feeling is mutual…

“John, EVERYTHING I have seen thus far from your work is nothing short of amazing!”
​--Michael Hunt, ERP Systems Manager, FCC

Michael approached me in December of 2020 at Diane’s recommendation, with the prospect of several potential projects. The first would be a custom supplier scorecard serving both internal Supplier Quality Managers (SQMs) and vendors accessing Plex via the Supplier Portal. The concept was well considered and well described by mockups and sample metric calculations presented in a PowerPoint deck.

In less than a month the basic design, with several grids and 3 charts, was up and running, and we’ve been refining it and making it more powerful ever since. Most recently we’ve added delivery metrics to augment the original suite of quality scores, and authorized users can now directly control PCN (tenant or business unit)-specific configuration of metric descriptions, visibility, sort order, and dynamic “conditional formatting” cell background coloration via the standard Plex Supplier Scorecard Metrics Setup application.

Because FCC has completed the transition from Classic to the UX platform, this was a UX/VP build; but it’s worth noting that a comparable dashboard could be built in Classic/VP, and that a common custom SQL code suite could drive both a Classic/VP dashboard and the UX/VP screen to be used after transition.

Also keep in mind this is just one possible approach.

The beauty of custom reporting is that you could have something very different.

In certain respects, what you could have is limited only by your imagination.

​So far on this project I’ve been able to rapidly deliver virtually everything requested – and more.


Conclusion

A good Supplier Scorecard is a powerful tool for developing your supply chain and continuously improving the quality of products both purchased and produced. Such commitment to quality is “table stakes” in today’s competitive environment and has a profound and immediate impact on the bottom line – especially if it’s absent.

By monitoring performance in near real time, Supplier Quality Managers (SQMs) can detect issues early, implement corrective actions, and verify actions are having the desired effect. Vendors with (appropriately limited) access to the same data via a “supplier portal” can see how their customer thinks they’re doing, and how they compare to other suppliers against whom they may be competing. Having this powerful feedback mechanism just a few clicks away leads to rapid, proactive, and continuous improvement efforts by the vendor.

​A supplier scorecard is particularly suited to the Plex SDE + VP technology stack because such historical reporting is not significantly impaired by report server replication latency.


John Perry Dancoe
Principal Consultant, von Blitz Design LLC
Lake Orion, Michigan, USA
2021-12-01