Category: DAM Education

The Concept of Trust in Your Workflow

This article was originally published on the DAM Coalition website, a property of Pro Video Coalition. As DAM Coalition was decommissioned in early 2015, this content was moved with permission.

by Nick Sincaglia

I was talking with a client recently about some of the challenges they were experiencing with a new workflow they had implemented. It was a bit of a departure from their existing workflows in that it relied much more heavily on external data providers. Until this point, their existing workflows had been driven primarily by internally created data, for which they had defined strict guidelines and on which they relied heavily throughout their workflow systems.

This client had expanded their operations into a new area of business which required them to rely on their outside data supplier partners to a much greater extent for the quality and consistency of the data which entered their systems. Needless to say, these new data streams were experiencing much higher failure rates and, as a result, were stressing their workflow systems in ways they had not experienced before.

The failures were causing a “log jam” of media files that would quickly fill up the workflow storage reserves and clog parts of their IT network, due to the re-routing and reprocessing of these files. A cascade of completely new workflow challenges had caught the organization off guard and was causing a significant amount of stress on their operations.

While talking through these problems that the client was experiencing, the concept of “trust” surfaced in our conversation. Since my background in workflow development has always tended to involve at least some amount of external data integration, “trust” was a concept I had deeply internalized and always included in my workflow designs.

It was through this conversation that I realized that building this concept of “trust” into my workflow designs was something that slowly evolved over time from experience. This client’s existing workflows had made assumptions regarding “trust,” which turned out to be no longer valid when applied to the external data they were receiving.

Trust in Workflow Design

So what do I mean when I refer to “trust” within workflow design? Trust in this circumstance is the level of confidence one has in the quality and consistency of the data being used as an input to any component within a workflow system. Trust can be applied to your business partners’ data feeds, your employees’ work quality, a server, the network, software applications, hardware devices or sub-components within your workflow systems. Each link in your workflow chain can be assigned a trust value that you can use to drive your workflow designs and increase the robustness and reliability of your workflow automation.

And as the saying goes, “trust should not be just given, it must be earned.”

This was the mistake my client had made. Their confidence that these new data feeds would have the same quality and consistency as their internal data feeds was set too high. As a result, their workflow systems were stressed in ways they did not anticipate, and now they needed to re-evaluate how they did things.

So, how does one build the concept of trust into one’s system design?

I have learned to develop profiles for the various components that make up my workflow systems. There are no hard and fast rules as to the level of granularity one must establish for these component profiles. It really depends on your situation and the level of detail you want to track, monitor or control.

Each of these component profiles may also include data level profiling. The idea is that you assign a trust ranking to each data input component that can be used to programmatically make decisions on how to process the data.

For example, if your data is coming from a workflow component that has a high ranking of trust, and you receive an update to a previously delivered data record, you might programmatically design your workflow systems to automatically accept that updated data record. If, under the same scenario, the data is coming from a low trust ranking data component, the data update might be further scrutinized by running consistency checks against other data sources, or by routing it to a manual process for human review. Each component within your workflow is designed to check an input’s trust ranking before processing the data and, in doing so, might receive instructions on how to process the data.
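The routing logic described above can be sketched in a few lines. This is a minimal illustration, not a prescribed implementation; the trust levels, source names and thresholds are all hypothetical.

```python
from enum import Enum

# Hypothetical trust rankings; real systems might use finer-grained scores.
class Trust(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Illustrative source profiles: each data feed is assigned a trust ranking.
SOURCE_PROFILES = {
    "internal_catalog": Trust.HIGH,
    "partner_feed_a": Trust.MEDIUM,
    "new_supplier_b": Trust.LOW,
}

def route_update(source: str, record: dict) -> str:
    """Decide how to process an updated record based on its source's trust."""
    trust = SOURCE_PROFILES.get(source, Trust.LOW)  # unknown sources earn no trust
    if trust is Trust.HIGH:
        return "auto_accept"        # trusted feed: apply the update directly
    if trust is Trust.MEDIUM:
        return "consistency_check"  # cross-check against other data sources
    return "manual_review"          # low trust: route to a human reviewer
```

Note that an unrecognized source defaults to the lowest trust level, which reflects the principle that trust must be earned rather than given.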

Trust as a Variable

Trust ranking is not a bulletproof way to guarantee that your workflow systems will not inadvertently allow low quality data to slip through. There will always be data quality issues that will surface that are unanticipated. However, this approach, if designed properly, will enable one to expand the granularity of these data quality checks and decision-making responses over time. Remember, the data quality of your workflow systems is not static; your data suppliers and workflow components might change over time, and their level of trust might rise or fall.

Before a workflow component starts to process the data it receives, it can be designed to quickly check who the supplier of the data is and what other sub-components had previously processed the data. From there, it can be instructed on what data elements should be scrutinized during the processing of the data.
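One way to sketch this provenance check is to treat the supplier and every prior sub-component as a chain whose trust is set by its weakest link, and to derive the fields deserving extra scrutiny from that. All names and rankings below are illustrative assumptions, not part of any real system.

```python
# Hypothetical trust rankings for suppliers and upstream workflow components.
TRUST = {
    "studio_feed": 3,      # high-trust supplier
    "aggregator": 2,
    "crowd_sourced": 1,    # low-trust supplier
    "normalizer_v2": 3,    # well-tested component
    "legacy_importer": 1,  # known to mangle data
}

# Fields to scrutinize at each trust level (illustrative).
SCRUTINY = {
    1: ["title", "rights", "identifiers"],  # low trust: check everything
    2: ["rights", "identifiers"],
    3: [],                                  # high trust: no extra checks
}

def fields_to_scrutinize(supplier: str, processed_by: list) -> list:
    """Return the data elements this component should double-check."""
    chain = [supplier, *processed_by]
    # The chain is only as trustworthy as its weakest link.
    weakest = min(TRUST.get(step, 1) for step in chain)
    return SCRUTINY[weakest]
```

A record from a trusted supplier can still warrant scrutiny if an untrusted component touched it along the way, which is why the whole processing history is consulted.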

At the same time, this trust ranking concept should not unnecessarily impede the data flow through your systems. One needs to balance the need for total data quality with the rate at which data must flow through the system. In most workflow systems, it is unacceptable for the output of any workflow component to fall too far behind the data rate of its input.

One situation where I found the greatest need for this trust ranking concept was when I was working with systems that mixed data associated with content from opposite ends of the “Long Tail” spectrum. The concept of the “Long Tail” was made popular by Chris Anderson’s book “The Long Tail: Why the Future of Business is Selling Less of More.”

Valuing the Tail End of the Long Tail

One side effect of the long tail that I noticed in my work was that the quality of the metadata degraded the further down the long tail one went. I don’t remember Chris Anderson discussing this “feature” of the long tail in his book. From my perspective this was the “dark side” of the long tail that the book failed to mention.


Digital assets that are less popular typically don’t receive the same level of metadata attention more popular assets receive.

In a typical marketplace, there are a number of parties involved in making content available. Typically there are creators, manufacturers, distributors and retailers. Even in a purely digital environment, these same roles tend to persist.

The “dark side” of the long tail is that the further down the long tail one goes, the less incentivized one is to spend time on metadata quality and consistency issues. Time is money, as the saying goes. The fewer copies of the content one expects to sell, the less likely one is to earn back the money invested in the content’s metadata.

If the creators of the content do not supply high quality metadata with their media, the responsibility of doing so is passed to the manufacturer. If the manufacturer does not have the incentive, they will pass the responsibility to the distributor. If the distributor lacks incentive, the responsibility continues on to the retailer. And if the retailer is not motivated to cleanse the metadata, it will simply get passed to the consumer.

So, when you are mixing metadata associated with both front tail content and long tail content, the concept of trust plays a very big role in how you design your workflows. Professionally produced content tends to have much greater metadata quality, because the suppliers of the content have a vested interest in making sure the content is properly prepared so that it can be easily received and processed by each party within the retail supply chain and on to the consumer for purchase. The opposite tends to be true for long tail content, for the simple reason that the more time spent addressing content metadata issues, the lower the probability of earning back the money spent doing so.

If you think about it, this same situation exists in almost all content environments, even in your company’s internal content systems, though perhaps not to the same extreme. There will always be high value content and low value content. Too much time and effort spent on the quality and consistency of low value content could result in a net loss to the organization.

Your organization probably already internalizes this reality in the way it runs its business, putting more effort into the quality and consistency of the data surrounding its high value content. And if you think about it a little further, shouldn’t your workflow systems also be able to internalize these same concepts?

Nick Sincaglia

Nick Sincaglia is President/Founder of NueMeta LLC, a consulting firm focused on digital asset and metadata management strategies. Nick’s company provides software development and system design consulting services for leading digital media & entertainment companies. Nick has been active in several industry technical standards bodies as a Board Member and Working Group Chairman for the Digital Data Exchange (DDEX), a participant in NARM’s Digital Think Tank, and a member of Metadata Working Groups sponsored by the European Broadcasting Union and the Audio Engineering Society. Nick has been a member of DAM Guru Program since 2013.

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

Required Skill Set For A Digital Asset Management Department Lead


by Nick Sincaglia

Having spent over 15 years in the field of Digital Asset Management, both as a consultant and as a staff member of some of the leading media and entertainment companies, I am frequently asked to help define the skill set required to lead an ongoing digital asset management initiative. Sometimes, I am even asked to assist in helping find the people to fill this role.

What skills should the leader of a digital asset management team possess in order to operate and maintain these types of systems, and enable the organization to realize its full potential? It is an important question to ask and even more important for the organization to get right.

I think the question is a challenging one to answer, due to the fast pace at which the digital media and technology industries are evolving. The skill requirements have grown over the years as the focus on the digital business has increased and departments and budgets have expanded.

I will express my opinions in answering this question based on my own experiences, but I know that not everyone’s DAM experiences are the same. I welcome your feedback and would be interested in hearing what you most value when selecting a leader for your digital asset management teams.

DAM as a Department

I think it is important for the leader of your DAM department to possess four main skills. Before we go into each of those, I want to point out that I deliberately used the term “department.” I have seen many companies try to tuck DAM into an existing department within their organization and, generally speaking, it never works very well.

DAM is unique. It is its own kind of animal. It involves a little bit of a lot of things, such as software engineering, database design, operations, licensing, product design, account management, etc. But DAM is not enough of any of these things for it to make sense to fold it underneath any one of these headings.

DAM really needs to be considered its own department that works closely with each of the other departments in the organization, but is viewed as its own discipline, with its own resources, release schedules and budgets. Recognize this early and you will avoid a number of problems, and you won’t inadvertently set up the members of the DAM team for failure.

DAM Skill #1

The first skill on my list for a DAM department lead to possess is strong workflow management and troubleshooting skills.

There are a number of formalized quality-analysis techniques and methodologies for recursively evaluating the operational components that make up your workflow systems in order to determine their quality and efficiency and to identify deficiencies. They include Six Sigma, Lean, Pareto Analysis, etc. But it is not the method itself, or “certification” in any of these methodologies, that is important. What is important is that the person is capable of analyzing complex and interconnected systems, and of thinking in a systematic way, so that they can put monitoring in place to recognize when a system is not running optimally, when failures are occurring, or when optimizations need to be added.

What I am trying to describe here is not the obvious IT system failures, like network outages or catastrophic software crashes. (Although dealing with these events is a part of the job as well.) What I am trying to describe is much more subtle.

There is an art in working in an environment where software interacts with large volumes of complex metadata. Workflow systems are designed around the data they are expecting to receive. But what happens when the software encounters data that is completely unexpected? Results vary depending on the design of the software and systems.

You can never predict 100% of the data variations that your systems will receive. The best you can do is put defensive barriers in place that recognize when data exceeds the expected norm, and either try to auto-correct it or route it through to an error-processing pipeline to be manually reviewed. The intelligent algorithms designed to make these complex workflow-processing decisions must be constantly re-evaluated and tweaked to handle newly discovered data variations that enter the system, so that the number of manually reviewed records decreases over time.
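A defensive barrier of this kind can be sketched as a single validation step with three outcomes. The field name, the "expected norm" and the auto-correction rule below are illustrative assumptions, not a real system's rules.

```python
def process_record(record: dict) -> tuple:
    """Defensive barrier: accept, auto-correct, or divert to the error pipeline."""
    year = record.get("release_year")
    # Expected norm (illustrative): an integer year in a plausible range.
    if isinstance(year, int) and 1900 <= year <= 2100:
        return "accepted", record
    # Auto-correction for a known, safe variation: a numeric string.
    if isinstance(year, str) and year.strip().isdigit():
        fixed = dict(record, release_year=int(year.strip()))
        return "auto_corrected", fixed
    # Anything else exceeds the expected norm: route to manual review.
    return "error_pipeline", record
```

Each newly discovered data variation that proves safe to fix automatically can be promoted from the error pipeline into the auto-correction branch, which is how the share of manually reviewed records shrinks over time.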

One can never expect to completely eliminate the need for manual review of failed metadata processing. Some poor quality data will slip past your defense systems and find its way into your production systems. When this happens, it is equally important for this person to be able to put into place a means to locate and correct any of these data issues before they have any significant effect on your data-driven business systems.

Because these software/data issues don’t typically result in catastrophic errors that take down your systems, they require one to pay careful attention to the system outputs of the various workflow components, and the interactions between them, in order to recognize when the results are deviating from the norm. Many times, these problems do not occur regularly and may only manifest themselves under certain conditions involving multiple inputs.

What kind of person would have the well-honed skills to be successful in this role? I don’t think one can say there is a single mold in which one must fit, but I will say that experience is critical.

Well-honed troubleshooting and analysis skills are not something one can acquire in a certification course or weekend workshop. Nor can one acquire proficiency by reading a book or preparing for a certification exam. The real world can present challenges that are enormously complex, especially within this era of “big data” in which more and more organizations are finding themselves.

I may be a little biased in my opinion on this, but I would lean towards individuals with a strong engineering background. The reason I say this is because engineers are typically trained and well practiced in the art of problem solving.

It is said that in order to be an expert in a subject, one must practice a cumulative 10,000 hours on that subject. The job of a typical engineer is to solve problems and devise solutions. Over time, their thinking becomes oriented towards looking at complex situations, breaking them down into smaller and smaller sub-components, and thinking about how to test each sub-component in order to determine the cause of the problem. I won’t say that engineers are the only ones who possess these skills, but I think engineers typically have more opportunities, both in school and in their work life, to hone these types of skills.

DAM Skill #2

A second area of focus, which I think is a very important trait to have in the field of Digital Asset Management, is to have a passion for metadata modeling.

I use the term “passion” because, let’s face it, metadata is not particularly sexy or exciting, but it is critically important to the success of your DAM systems. If you can find someone who has strong opinions in this area, that person is likely to have spent time studying the subject.

I firmly believe that with DAM systems, architecture really, really matters! A significant part of any DAM system architecture is the way it captures and stores the metadata that describes its digital assets. “Content is king,” as the saying goes; but that content does not exist if you can’t find it! In the world of DAM, it could be easily argued, “context is king.” If you take this to heart, you will recognize how important metadata models are.

In fact, I view the processing of the media files as the easy part of DAM; it is the metadata that is the hard part. You need to either leverage existing metadata standards or build your own data models to accurately represent the needs of your business. Then you must figure out how to accurately capture this data and protect it from being corrupted over the life of the content.

Metadata design is more than just accumulating a list of data fields used to capture metadata. There are real design considerations and industry expertise one needs in order to develop a metadata model that will stand the test of time and grow with your organization’s needs.
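To make a couple of those design considerations concrete, here is a minimal sketch of an in-house asset record. The field names and the controlled vocabulary are assumptions for illustration, not an industry standard such as Dublin Core; the point is that a model encodes rules (required identifiers, controlled values, immutability) rather than just listing fields.

```python
from dataclasses import dataclass, field

# Illustrative controlled vocabulary; a real model might reference a standard.
ASSET_TYPES = {"audio", "video", "image", "document"}

@dataclass(frozen=True)  # immutability helps protect records from corruption
class AssetRecord:
    asset_id: str                 # stable internal identifier, never reused
    title: str
    asset_type: str               # must come from the controlled vocabulary
    contributors: tuple = ()      # ordered, immutable list of contributors
    external_ids: dict = field(default_factory=dict)  # e.g. partner identifiers

    def __post_init__(self):
        # Validate on creation so bad values never enter the catalog.
        if self.asset_type not in ASSET_TYPES:
            raise ValueError(f"unknown asset_type: {self.asset_type!r}")
```

Validating at the point of capture, rather than downstream, is one way to protect the metadata from being corrupted over the life of the content.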

Also, don’t forget that we operate in an increasingly connected world, so making your metadata models interoperable with both internal and external parties is becoming increasingly important. Having someone who has both strong data modeling experience combined with in-depth knowledge of other available data modeling options is essential, in my opinion.

DAM Skill #3

Another trait that I feel is necessary for a DAM department lead, and cannot be ignored, is a strong background and interest in new technology.

The pace of change in the field of technology has become truly exponential. DAM is not just about technology (remember we started our discussion on workflow); however, the right use of technology can make the difference between working harder and working smarter.

In just the last few years, I have seen a five- to ten-fold increase in efficiency with some newer technologies over older tried and true techniques and methodologies. This can have a huge impact on the budget and resource requirements needed to build and maintain your DAM systems, not to mention your competitiveness in the marketplace.

A genuine curiosity for new technologies and new approaches to old problems, combined with a healthy skepticism and the ability to evaluate the trade-offs between the new and old approaches, is important.

DAM Skill #4

The last skill that I think is important in a DAM department lead is a good understanding of intellectual property rights.

One does not need to have the background of an IP lawyer; however, the plain reality in the Digital Asset Management business is that one rarely owns 100% of the rights under all circumstances to the content they manage.

Most content has multiple parties with claims to and interests in the intellectual property contained within those digital files. Those interests are outlined in contracts with all sorts of conditions and limitations. Understanding the true meaning these contractual clauses have on your digital asset catalog is critical. You should expect your business to change over time. Having someone who understands the implications these changes can have on the use of the digital assets, the royalties owed, and the legal risks, will be critical to your business.

By no means is this a complete list of skills. But from my experience, these are four critical skills that can significantly impact your DAM and your business. If you can find someone with this wide array of skills and experience, you will certainly be off to a very good start.


Learn Controlled Vocabularies

Visit David Riecks’ Controlled Vocabulary website for valuable DAM-related resources and a user forum with more than 1,000 members. It is a wonderful resource where you can learn more about the art, science and benefits of using controlled vocabularies in DAM. In addition to a wealth of smart information written by site owner David Riecks, the site is also home to the Controlled Vocabulary Discussion Forum, in which more than 1,000 professionals from across the globe share ideas and information. Accounts are free and discussions are moderated, ensuring that the forum remains free of spam and other nonsense.