Category: DAM Education

Leveraging In-house Expertise for DAM

by Collin Rickman, MLIS

You drum your fingers against your desk, nervously anticipating the news you’ve spent years waiting for: The latest iteration of your organization’s digital asset management strategy—an elegant and complex system of people, processes, and technology crafted for machine-like efficiency—has gone live.

One final time, you race down a well-trodden to-do list in your head, sifting for possible weaknesses. Are the user guides comprehensive, yet concise? Is our governance policy strong enough? Are our DAM cheerleaders dispersed strategically and ready to make waves at a moment’s notice? How is everyone’s internet connection looking today? Taxonomy tight? Metadata managed? Interface intuitive?

But those thoughts vanish as your inbox explodes with accolades, and your phone starts ringing off the hook. A grin dances on your lips. Sinking back into your plush leather chair, you laugh hysterically and pour yourself a glass of champagne—no, two glasses! When in Rome. Your work is finally finished!

Or not.

DAM is a living, breathing organism—a constantly evolving concept that demands the utmost attention from its practitioners. Long after the confetti is cleaned up and integrators have moved on to new projects, things inevitably change or go wrong.

Maybe your team supports clients with business needs that change often, and you need flexibility to rapidly meet and support those needs. Perhaps your organization is acquired by another, and your user base multiplies overnight, requiring increased capacity and performance that your system doesn’t possess. What is the solution when a critical server fails on the eve of an important deadline day, delaying access to important assets at the worst possible time?

While the answers to these questions vary from organization to organization, and depend on whether your DAM solution is self-hosted, SaaS, or a hybrid of some kind, chances are high you will be acting as a liaison by leveraging in-house expertise to meet your team’s technological objectives. The synthesis of stakeholder needs, IT department objectivity, and the practical realities of your day-to-day DAM operation can be challenging alchemy.

Luckily for you, librarians (and this writer) have a few tips to offer:

1. Translation. A typical librarian, like a DAM Manager, has expansive reach into many areas of an organization, as both jobs require wearing many hats out of necessity. Most people don’t get this opportunity and, as a result, interdepartmental communication can sometimes be strained and counterproductive. Thus, being able to translate the needs of stakeholders (who may not occupy technically focused roles) into terminology that makes immediate sense to IT allies is crucial. And the reverse matters too: translating tech-speak for stakeholders keeps them aware of information that applies to their parts of the workflow, and emotionally invested in the process. Exercising this muscle is not unlike conducting what librarians call a “reference interview,” in which the librarian seeks to isolate and articulate the essence of a patron’s request regardless of its appearance at face value. So, call upon your love of words, your interest in technology, and your desire to see disparate groups of people collaborate effectively if you want to avoid misunderstandings. Not only will you maximize efficiency, but stakeholders will also grow to appreciate your connective problem-solving abilities.

2. Documentation. Sometimes, things get lost amidst the onslaught of details, meetings and challenges inherent in any large project. It is easy to let dreams run ahead of drudgery. It’s a good thing librarians have extensive experience in documenting and preserving information. Naturally, DAM managers can emulate this when creating an IT governance policy. You will make massive inroads in your IT relationship if you have an established guide to your DAM’s technical construction and operation. This includes, but is not limited to, a complete roster of any and all hardware; a list of employees involved with DAM operations along with their job titles, departments, contact information and DAM responsibilities; and troubleshooting procedures complete with scenarios and step-by-step instructions on what is necessary to resolve issues. Your integration team will likely need to play a role in this activity, and it’s likely an ongoing writing process—you will discover more quirks as time goes on. But having forethought and research at your disposal is a godsend for IT staff, who are often expected to divine solutions with little to no information. It’s an important, albeit mundane, responsibility that makes all the difference in the world when the other shoe drops.

3. Distillation. Librarians are experts at squeezing out every possible drop of budget in order to maximize resources. After all, being able to prove a library’s usage rates and value is an important part of keeping the doors open year after year, whether in a busy urban public library or a special library. This same pragmatic approach is a natural fit in a DAM environment. IT sees system upgrades and customization in purely ROI terms—what the bottom line is, given that budgets and agreements are hammered out far in advance. This may conflict with the expectations and desires of some stakeholder groups for how they feel DAM should operate. Some requested features and solutions require hours committed to conceptualizing, testing and implementation that may prevent limited resources from being spent on other issues. Other solutions may be great for large organizations, but unfeasible for smaller ones. Others, still, may be wonderful ideas, but only beneficial for a small subsection of the entire user base. Paring down high-level concepts into practical analysis will make it easier to arrive at informed solutions. This will also be the difference between overcoming unexpected setbacks and becoming mired in them. Have you truly delineated the must-have solutions from the nice-to-have solutions? The reverse also applies: knowing when to stick to your guns and push for important solutions that others may hastily appraise as unnecessary can be a lifesaver later on.

4. Coordination. Amid the inconvenience that accompanies any major maintenance activity or upgrade, digital asset managers must be adept at ensuring that all participants and moving pieces are attuned to each other. Much as running a busy library or archives requires a librarian to be on top of scheduling and coverage during a time of crisis, a DAM manager must constantly analyze workflows to devise workarounds that minimize impact to business, bottom lines, and their stakeholders’ patience. Are upgrades being done at a time when the office is closed? Will this planned outage affect any imminent deadlines? If something goes wrong, is there enough time to diagnose and fix it before business resumes? Is there a temporary system to facilitate requests, so as not to impact normal business? Keeping things humming along is not easy, but being caught off guard is even more difficult.

5. Preparation. Even the most dedicated librarian occasionally has to prioritize in order to create some kind of order amidst chaos. The IT world is no different. When working with IT to isolate glitches and deploy solutions, a little groundwork and attention to presentation go a long way. This is especially important if your team does not have a dedicated support staff, which is often the case. If a user discovers a bug, make it a point to augment your support tickets with enhanced information that can narrow the focus to possible solutions, speed up response time, and present the issue as well-founded and cohesive. This could take the form of screenshots, or a carefully crafted step-by-step guide on how to reproduce an issue. Reference-ready contact information for other stakeholders required for collaboration is always helpful. Depending on how comprehensive your reporting services are, the exact times, conditions and environment of the incident at hand could be key to solving the problem. Or throw in some freshly baked cookies. Whatever separates your issues from the slew of other frustrated, cryptic missives IT sometimes receives is a sound investment not only in your DAM project, but in your own relationships with your IT colleagues.

Of course, these are only a few pieces of advice from a long line of librarian lore. But with some extra attention, using these fundamentals to your advantage can keep your customization and upgrade processes on solid ground. And as a member of DAM Guru Program, a friendly librarian is only an email or phone call away.

About Collin Rickman

Collin Rickman earned a Bachelor of Arts in Digital Technology & Culture at Washington State University, and his Master of Library and Information Science at San Jose State University. His career began in archives, special libraries, and film preservation; but a twist of fate led him to sunny Southern California and his current position as an assistant DAM manager for Oakley. When he’s not lost in DAMworld, his interests include net neutrality, information secrecy, e-waste recycling and gamification.

Collin has been a DAM Guru Program member since August, 2014. Connect with him on LinkedIn.


Read more from the “Librarian Tips for DAM Managers” DAM Guru Program series »

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

DAM User Adoption and Training

by Margie Foster, MLIS, Information Management

This is the story of two DAM systems at two different companies. The companies shared many similarities. In both cases, the digital asset management system was relegated to a home with the design team, where the managers understood the need for a searchable collection of assets that could be reused at significant cost and time savings.

At Company A, user adoption was a key consideration from the outset.  Before choosing a system, they formed a team that included likely power users, and they secured an executive sponsor. In addition, key users (tipping point people whose adoption would best promote success) were added.  Research and the eventual RFP proceeded from there.  The two best systems were brought in for a test period.  Volunteers from the user group completed a set of basic exercises. After each exercise, the user was asked to rate the experience on a scale of 1 to 5.  The results were tallied and shared with the team.  The stronger system was chosen.

At Company B, however, users were never much of a consideration. The design manager persuaded the next-level manager to include a DAM in the next budget cycle, with the understanding that the system would be rolled out to the rest of the company a year later. No users were involved in the selection. No systems were tested in-house.

Which DAM system was successful: Company A, with user buy-in, or Company B with not a user in sight?

Certainly, Company A scored the initial success; but the story doesn’t end there. User adoption is not a one-time event. Early adopters can fall away, becoming disenchanted over time, while others leave the company as new users come onboard. Company A had to find a way to support those early adopters and grow new ones. The challenge at Company B was more obvious, and not unlike trying to climb Mt. Everest in a lead snowsuit during a blizzard: It could be done—maybe—but the odds were not good.

Both companies hired DAM librarians who developed user-training programs, with the goal of user adoption and retention. User training brings the traditional talent of a librarian—that loves-helping-people orientation—to the fore. Every librarian knows that different users prefer to learn in different ways, so it is a standard best practice to offer multiple paths to training:

  • Some people want to dialog every time they need assistance. (This usually comes from the infrequent user.)
  • Some need custom finding aids that address their specific needs.
  • Others are good with screen shots in an email.
  • Few have time to read blog entries about new features until they need those features, so the blog doubles as an FAQ.
  • Certain power user groups are best updated in their team meetings as part of a regular round table.   (Those often prove invaluable feedback sessions as well.)
  • And there are those who, no matter how much training is available, simply will not use the system directly—they must have research done for them. (Company B initially had far more in this last category.)

In addition to the measures already listed, the DAM librarians instituted benchmarks, auditability, metrics, and reporting. Success was defined and measured. Goals were set. At Company A, the analytics were used to justify system upgrades. At Company B, the analytics could not save upgrades from being cut out of the budget.

In the end, Company A’s DAM fell victim to advancing technology outside its system. The search engine was no match for Google and dissatisfaction grew among the users. DAM system updates were never as advanced as they needed to be.  Before long, another team was assembled to begin exploring a replacement system.  But the users never questioned whether a system was needed.  In this case, user adoption and training worked to keep the user engaged and a DAM deployed.

At Company B, the system was adopted officially, but never in actuality by the power users. Despite all the types of user training made available, and an actual rise in the number of users, the power users persisted in keeping stashes of images on their hard drives. In the end, they had a system of last resort, and a lot of duplicated assets in various silos. Worse, the system software was not upgraded consistently; its hardware was never replaced or migrated to the cloud. Gradually, even the occasional users dropped off, opting instead for direct research requests to the DAM librarian. Unfunded and unused, the system ran on a version the software vendor eventually had to retire before Company B would grant funds for an upgrade. Despite the lobbying efforts of the DAM librarian, the opportunity to replace the system was squandered. Its chances of successful adoption were not improved.

There are no guaranteed successes in this story—no special tricks or tips to be employed. DAM systems are vulnerable to budget shortfalls, top-down mismanagement, the limits of their own technology, and whether or not the users have a better, faster workaround. But a DAM system with user buy-in, and a DAM manager willing to work to retain that buy-in, stands the greater likelihood of initial adoption and solid retention. Users who know how best to use the DAM for their specific needs, and whose searches consistently return relevant results with high precision, are much more likely to express satisfaction. As every good DAM librarian knows, the combination of strong, measured user-adoption rates and training is a far, far better thing, whatever the circumstances, and sometimes the user’s only hope.

About Margie Foster

Margie Foster began her career in digital asset management while working as a photo editor for an educational publisher. As the need for digital asset management grew alongside the surge in electronic publishing, she became an advocate of enterprise-wide systems development. This culminated in her role as Manager of Intellectual Property, where she led a group responsible for image research, digital asset management, and project file archives. It was also during this time she earned her Master of Library and Information Science degree. After the birth of her twins and a brief hiatus, Margie took a position with Freescale Semiconductor, where she is presently employed as the DAM Librarian.

Margie has been a DAM Guru Program member since 2013. Connect with her on LinkedIn.


Read more from the “Librarian Tips for DAM Managers” DAM Guru Program series »


DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

Controlled Vocabulary for DAM

by Tracy Wolfe, MLIS

One of the ingredients essential to a successful digital asset management implementation is a controlled vocabulary. A controlled vocabulary (CV) is simply an established group of terms used to describe assets in the DAM. A controlled vocabulary can help users search more effectively. Most importantly, a controlled vocabulary lends consistency and ease for anyone adding assets to the library.

Controlled vocabularies are used for websites all the time, especially to enable search on e-commerce sites. Employing the same techniques for DAM makes sense, as many users expect the DAM search experience to mimic the internet.

What is the best way to create a controlled vocabulary? Whether you need a couple hundred keywords to describe your content or several thousand (or millions), the steps to initiating the vocabulary are similar. Depending on the complexity of the filters and metadata fields you choose to utilize, you may construct more than one CV to support your DAM system.

This article explains the basics of creating a term list. These steps can be repeated to create multiple lists for use in facets, filters or hierarchies forming the basis for a DAM taxonomy.

  1. Before collecting terms, first consider the content you need to describe. Do your assets focus entirely on one product or initiative or many? How do people in the organization talk about the content?
  2. Who will be responsible for maintaining the CV? Will the DAM manager add or change terms, or will multiple people share this responsibility? Either way, write down instructions or guidelines for adding or changing terms.
  3. Think about the users. If your DAM is internal and everyone in the organization shares the same language surrounding the assets, you will require fewer terms that may be very industry specific. If you will be serving both internal and external users (like vendors or travel agents or franchise owners, etc.), the keywords or terms will need to be more universal.
  4. Is there an established vocabulary available to provide a starting point? There are quite a few vocabularies available as examples or models depending on the subject matter. From Library of Congress Subject Headings (LCSH) to the Getty Art & Architecture Thesaurus (AAT), librarians have compiled trusted standard vocabularies for many domains.
  5. How does the DAM system handle tagging or keywording? Are hierarchies allowed or will your workflow be more streamlined if the keywords are attached to files prior to ingestion? Should the vocabulary be a text file, Excel file or built in some other way?
  6. Okay, now the fun part begins. If you are building the vocabulary from scratch or adapting another vocabulary, you will collect terms from various sources. Decisions will be made regarding the final set of terms and all of this work should be documented for future reference.

Where can you find terms for the controlled vocabulary?

  • Does your organization have a website? Check out the site map and the search logs to find the words users actually search with.
  • How do people in the organization describe what you do? If you sell a product or products, will you need a list of brand names? What departments will be supported by the DAM? Will the users search on things like color for design purposes?
  • Are you describing photos, videos, audio files or more? Consider the differences in describing two-dimensional versus moving images and the breadth of additional terms that could enhance findability of all types of media files.
  • Look at similar organizations. Competitors’ websites, trade journals, and articles on the internet can all provide ideas for a term list.

At this point, you probably have a list of many words. It is time to organize the terms, establish hierarchies if applicable, and decide on preferred terms and synonyms or variants. A standard way to go about this is to designate broader terms and narrower terms.

In this example, we will look at the term Handbags and the broader terms (BT) and narrower terms (NT) related to Handbags.

Handbags
(BT) Accessories
(NT) Bucket Bags
(NT) Crossbody Bags
(NT) Totes
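
If your DAM (or an external tool) lets you manage the vocabulary as data, the same structure is easy to sketch in code. Here is a minimal illustration in Python; the term names come from the example above, but the storage format itself is just one assumption about how BT/NT relationships might be kept, not any particular system’s format:

```python
# A controlled vocabulary entry with broader (BT) and narrower (NT) terms.
vocabulary = {
    "Handbags": {
        "broader": ["Accessories"],
        "narrower": ["Bucket Bags", "Crossbody Bags", "Totes"],
    },
}

def narrower_terms(term):
    """Return the narrower terms recorded for a preferred term."""
    return vocabulary.get(term, {}).get("narrower", [])

print(narrower_terms("Handbags"))  # ['Bucket Bags', 'Crossbody Bags', 'Totes']
```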

Naturally, this type of work is sometimes challenging in a group, so agreeing upon the decision-making process—primarily, which keywords will “win” or be included—is the most crucial step at the outset. Like any document, the vocabulary can be edited and altered over time, but finalizing the initial list will allow it to be tested and used to inform these updates.

Once a controlled vocabulary is in use, reviews and updates are always expected, but the main advantage of having created the vocabulary in the first place is consistency. Maintenance of the CV can be ongoing or the vocabulary can be reviewed and refined periodically. Make sure to include the controlled vocabulary in the overall best practices and standards for the DAM.

The best thing about a controlled vocabulary is the improved findability of assets. Knowing what to expect by having a framework for describing assets helps both when adding assets and when performing searches. The time invested in building a controlled vocabulary will provide a huge return and positively impact the experience surrounding the digital asset management system in your organization.

About Tracy Wolfe

Tracy Wolfe worked as an advertising producer for 13 years and managed a digital asset management system at DDB. Tracy pursued an MLIS at San Jose State University as a result, focused on emerging technologies and has been working in DAM or search ever since. Tracy managed a DAM at Corbis Images for a high profile non-profit client and is currently working in Search Strategy at Getty Images. Ms. Wolfe’s interests include taxonomy, helping creative users find assets, and streamlining pretty much any process. Tracy has been involved with digital asset management for almost ten years.

Tracy has been a DAM Guru Program member since 2013. Connect with Tracy on LinkedIn.


Read more from the “Librarian Tips for DAM Managers” DAM Guru Program series »

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

DAM and the Art of Governance

by Tracy Wolfe, MLIS

Governance is defined by Dictionary.com as ‘government; exercise of authority; control.’ In a digital asset management scenario, the most basic definition of governance boils down to which users have access to which assets and why. A solid governance strategy stretches beyond this to encompass the parameters of the system itself, as well as the information traveling with each asset within those confines.

Naturally, different DAM systems handle user governance differently, with permissions at the asset level or, more commonly, by folders or galleries in combination with user groups. Sound complicated? It can be, especially if proper planning, communication and documentation are lacking.

Do you really want the design studio preparing the annual report to have access to the photos of your company’s CEO at the holiday party? Are there rights-managed images in your DAM, and would it be costly and embarrassing to be slapped with an infringement claim for usage on the web when the licensing is for print only? Does this all make your head spin when you multiply the number of DAM users by the number of assets and possible combinations?

If so, do not fret. There are ways to simplify governance and to ensure that the proper folks have access to only the assets they need, no matter how your DAM system is structured. And, while a ton of articles mention that governance is important in terms of best practices for DAM, not many discuss how to actually implement a governance strategy, how to discuss the topic with DAM users and stakeholders, how to manage the inevitable changes, and how to retain your own sanity in the process.

First and foremost, a governance strategy for digital asset management should be distinctly different from the IT governance plan for the organization overall. When developing a governance strategy for a DAM system, take into consideration the following things:

  1. What is your organization’s culture? If the culture is somewhat loose, as in many creative agencies, simplicity will be key in establishing governance practices.
  2. If the DAM system will be used by multiple departments, set up a governance committee at the outset with members representing each user group and IT. Meet briefly and frequently – communication is key.
  3. Decide who will apply metadata, how it will be applied, and who will create the metadata. Will it be partially a result of embedded fields like IPTC, or will there be a set of pre-defined elements or a standard employed like Dublin Core?
  4. Is there an established taxonomy or controlled vocabulary? If not, can you collect data to establish a controlled vocabulary? In conjunction with the vocabulary, can you utilize a single taxonomy or classification hierarchies organization-wide?
  5. Do you need a retention policy and retention schedules? Do your assets expire or lose relevance over time? Is there a need to archive assets after a certain point? Document all of this, no matter how minute or obvious it may seem now.

Whether you are launching DAM for the first time in your company, reining in a system that has been in place for some time, or simply revamping the policies already in place, governance strategy should be well documented. A good DAM manager, like a librarian differentiating between reference-only items and circulating materials, will keep records. These may take the form of spreadsheets or flowcharts in a secure location delineating user group permissions, asset restrictions, metadata fields both required and optional, workflows, controlled vocabulary terms and taxonomy structure.
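
Those records pay off most when they are unambiguous enough to act on. As a purely illustrative sketch (the group, folder and permission names here are invented), even a simple lookup table makes the access rules explicit and testable:

```python
# A toy permission matrix: user group -> folder -> access level.
# None means the group has no access to that folder at all.
PERMISSIONS = {
    "design-studio": {"brand-assets": "download", "executive-events": None},
    "communications": {"brand-assets": "view", "executive-events": "download"},
}

def can_access(group, folder):
    """True if the group has any level of access to the folder."""
    return PERMISSIONS.get(group, {}).get(folder) is not None

# The design studio preparing the annual report never sees the
# CEO's holiday party photos:
print(can_access("design-studio", "executive-events"))  # False
```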

Additional tools can be used to store user logins, manage taxonomy, and automate metadata application, complementing and contributing to the governance strategy.

The most important aspects of the governance strategy are the organizational buy-in on the policies for digital asset management and the documentation of these rules. The benefit will be the ease of decision-making enabled by an established governance plan. Don’t worry about how formal or official these policies may be – the value is in having the discussions leading to the creation of the governance plan and simply in having it all written down.

About Tracy Wolfe

Tracy Wolfe worked as an advertising producer for 13 years and managed a digital asset management system at DDB. Tracy pursued an MLIS at San Jose State University as a result, focused on emerging technologies and has been working in DAM or search ever since. Tracy managed a DAM at Corbis Images for a high profile non-profit client and is currently working in Search Strategy at Getty Images. Ms. Wolfe’s interests include taxonomy, helping creative users find assets, and streamlining pretty much any process. Tracy has been involved with digital asset management for almost ten years.

Tracy has been a DAM Guru Program member since 2013. Connect with Tracy on LinkedIn.


Read more from the “Librarian Tips for DAM Managers” DAM Guru Program series »

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

Standards and Metadata

by Lisa Grimm, MA, MS-LIS

While librarians love global standards and useful metadata, even within the traditional library we can be confronted with a less-than-consistent institutional approach to those standards, and even wider variation in the tools used to maintain good metadata. That’s especially true in the DAM world, where things like Dublin Core or MeSH can sound like mysterious codes, or even foreign languages, to someone who didn’t attend library school. And if you’ve come into the field from a ‘straight tech’ or marketing route, you may feel you already know everything you need to about metadata – it’s always been important for search and SEO, and few people knew or cared about standards there, right? On the flip side, degreed librarians may throw their hands up in dismay at how different DAM vendors approach metadata management – those global standards can be difficult to implement, even with the best of intentions. Let’s try to clear up the picture.

Types of Metadata

As a DAM professional, you already know the value of good metadata – you can’t find or properly manage your assets without it. You’re already using a variety of flavors of metadata within your DAM – some of it is descriptive, to help power your DAM’s search capability; some is administrative, so you can track an asset’s usage and rights; while those free-text fields may serve as a catch-all for everything that didn’t quite fit – or as a workaround for something your system doesn’t do without considerable tinkering. NISO likes to add a third broad category for structural metadata, but that’s (usually) less relevant to a DAM – it may be called upon to drive display or layout of a page, printed or otherwise. If you’ve spent a lot of time working with XML or ePub files, you’ll know what structural metadata looks like, but it’s less generally applicable to your images, illustrations and videos, at least within the DAM itself – they may certainly end up being referenced or described in those files in the wild. Whether you knew it or not, you have an in-house metadata model, and you may want to refine it or change it altogether.

Controlled Vocabularies & Benefits

Librarians love controlled vocabularies, and tend to wax lyrical about their favorites, like the Library of Congress Subject Headings (LCSH). But for our purposes here, a controlled vocabulary can be as simple as a picklist in a dropdown menu.

If you can pre-populate your DAM’s metadata fields with commonly used terms and names that make sense for your organization, you can reduce the scope for user error, thus ensuring that you keep your assets easily findable – no typos or three different names for the same agency or product (though more on related terms in a moment). Of course, that may not be as easy as it should be with your system – but we’ll look at some strategies there in a moment as well.
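
To illustrate the idea (the field and agency names below are invented), a picklist is ultimately just a validation step between the user and the stored metadata:

```python
# A controlled picklist for an 'agency' metadata field. Rejecting
# free-text variants keeps 'Acme', 'ACME Creative' and typos from
# entering the system as three different agencies.
AGENCY_PICKLIST = {"Acme Creative", "Northwind Media", "In-House Studio"}

def validate_agency(value):
    if value not in AGENCY_PICKLIST:
        raise ValueError(f"'{value}' is not an approved agency name")
    return value

validate_agency("Acme Creative")   # passes
# validate_agency("Acme")          # would raise ValueError
```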

Another benefit of going with an existing standard is interoperability with other systems: if your DAM ties into other systems, be they for rights management, HR, licensing or translation, using the same internationally recognized standards for your metadata model may make everyone’s lives easier as digital objects travel across your technology ecosystem.

Existing Standards

First off, there are a lot of standards out there. Committees have spent unpaid months and years creating and refining them, and most of the time, they’ve ended up with a pretty sensible set of terms for their given brief – no matter how specialized your assets are, one of the existing standards is probably a good fit, at least as a baseline, so there’s no need to start from scratch. We’ll look closely at the more generally applicable ones, and then mention a few specialized options.

Dublin Core

Dublin Core or, more fully, the Dublin Core Metadata Initiative (DCMI), has been around in one form or another since 1995, when it was first mooted to help give more structure to web resources to make them findable – something any DAM professional can empathize with. The ‘Dublin’ in question here isn’t the one in Ireland, but rather Dublin, Ohio, site of the initial workshop, which was sponsored by OCLC and NCSA. OCLC, known by its initials to any library professional, maintains (among other things) WorldCat, the global catalog that stores data from libraries in more than 170 countries around the world. NCSA produced the first widely-adopted browser, Mosaic, which would eventually be reborn, phoenix-like, as Mozilla Firefox – but we digress.

Getting up to speed on Dublin Core is easy. (There are regular webinars on the DCMI site, but they may be more in-depth than what you need if you’re just beginning to implement some basic metadata standards.) You can learn a lot just by looking at some Dublin Core in action, whether it’s expressed in XML or in the metadata fields your DAM has already.

The beauty of Dublin Core is that it’s nearly endlessly extensible, though its core of 15 top-level categories, known now as the Dublin Core Metadata Element Set, is broadly applicable to almost any digital object. These elements will look like (at least vaguely) familiar metadata fields to most DAM users. Indeed, some systems arrive out of the box with nothing much more than free text fields bearing these labels:

  1. Title
  2. Creator
  3. Subject
  4. Description
  5. Publisher
  6. Contributor
  7. Date
  8. Type
  9. Format
  10. Identifier
  11. Source
  12. Language
  13. Relation
  14. Coverage
  15. Rights
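
As mentioned above, Dublin Core is often expressed in XML. Here is a minimal sketch, using only Python’s standard library, of what a handful of those elements might look like in a record; the element names are genuine DCMI terms, but the sample values are invented:

```python
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"  # the Dublin Core namespace
ET.register_namespace("dc", DC_NS)

record = ET.Element("record")
for element, value in [
    ("title", "Spring Campaign Hero Shot"),
    ("creator", "Jane Photographer"),
    ("date", "2015-04-01"),
    ("rights", "Licensed for print use only"),
]:
    ET.SubElement(record, f"{{{DC_NS}}}{element}").text = value

print(ET.tostring(record, encoding="unicode"))
# <record><dc:title>Spring Campaign Hero Shot</dc:title>...</record>
```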

But these fields, and the large variety of other Dublin Core descriptive terms available, may be used differently in different DAM solutions. And not every field, even of the core fields, is relevant to your particular assets. So it’s all about customization; we’ll dig into that below.

XMP

On the face of it, XMP sounds fantastic – you can embed (much of) the metadata you need right into your digital object! You can even use Dublin Core or another existing standard as the starting point. But actually implementing XMP as a standard for your DAM can be tricky, unless you have total control over the creative process from start to finish, since XMP is generally embedded via Adobe Photoshop or Bridge (XMP began life at Adobe, after all). Getting agencies to understand and follow your ‘rules’ isn’t always as straightforward as it should be, and while some DAMs do let you add XMP to assets, often you need to rely on whoever created the file – and even then, it may not apply itself to every file type.

Another question to consider is whether your DAM’s search can index XMP – is that information being used by your system, or is it lost in the ether? That said, XMP can still be useful, even if it’s not powering your DAM’s search results. Licensing and other rights information can be built in and tracked throughout the asset’s life cycle, provided, of course, the XMP actually travels along with the file as advertised. As of this writing, there is certainly potential, but it may be more trouble than it’s worth for most time-crunched DAM administrators.
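
One quick, low-tech way to audit whether the XMP actually travels with the file is to check for the packet wrapper itself. XMP is serialized as XML text inside `<?xpacket?>` processing instructions, so a simple byte scan (a rough sketch, not a full parser) can at least tell you whether a packet is present:

```python
def has_xmp_packet(path):
    """Crude check: does the file contain an embedded XMP packet?"""
    with open(path, "rb") as f:
        data = f.read()
    # XMP packets are wrapped in <?xpacket begin= ... ?> markers.
    return b"<?xpacket begin=" in data and b"<?xpacket end=" in data

# e.g. has_xmp_packet("hero-shot.jpg") -> True or False
```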

Other Standards – MeSH, ULAN, AAT, etc.

Even if you do opt for Dublin Core (or a Dublin Core-light approach), you may want to seek out some of the more specialized options that exist. If your DAM supports medical or pharmaceutical assets, MeSH may be useful. For art-related collections, ULAN and AAT are incredibly thorough. There are many other specialized standards, and in most cases, you can use them as a sort of ‘bolt-on’ to your main underlying metadata model.

Customization and Implementation – the ‘How’

Once you (think you) have settled on a model, the real work begins – figuring out how to actually get your chosen model into the system, and how you want to approach applying it to your assets, whether they are newly-imported or legacy files. And while some DAMs will let you test and preview changes within the system, that’s more the exception than the rule, so we’ll assume for our purposes here that much of the upfront work will need to be done outside the system – then we’ll move on to implementation.

1. Analyze existing data:

  1. Does your DAM store user search terms, abandoned searches and user journeys through the system? This is wildly useful in refining your model, especially if you want to use, say, Dublin Core, but you notice that your users don’t seem to employ terms like Creator or Contributor. If collapsing those two fields into something more like ‘Agency’ or ‘Photographer’ works better, that’s great information. (One way to mine those logs is sketched after this list.)
  2. Are there metadata fields that are left consistently blank? It may be that you don’t need them, or that their purpose isn’t understood and that they need to be re-labeled.
  3. Do you have free-text fields that would be better served with drop-downs (e.g. list of agency names, products, countries)? Make note of them before you move on to the next stage.
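
Here is one minimal way that log-mining might look, assuming (hypothetically) that your DAM can export searches as a CSV with `query` and `result_count` columns; your system’s actual export will differ:

```python
import csv
from collections import Counter

def abandoned_queries(log_path, top=20):
    """Most common searches that returned zero results."""
    misses = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["result_count"]) == 0:
                misses[row["query"].strip().lower()] += 1
    return misses.most_common(top)

# Terms users search for but never find are prime candidates for
# new keywords, synonyms or re-labeled fields.
```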

2. Avoid metadata overkill:

  1. More isn’t always better. Not only do you need to make sure your fields are properly filled out; if you apply too many search terms, you may not get granular enough results.
  2. Just because a field exists in Dublin Core (or another existing standard) doesn’t mean you need to use it, or to use it in the ‘preferred’ way. If something else works better for your organization, feel free to make changes; just be consistent in your approach.
  3. Consider the maintenance ramifications if you do use a large number of fields – what happens if you need to modify them? This may be only a minor consideration for some DAMs, and a huge lift for others.

3. Plot out your proposed changes:

  1. Hit the spreadsheets! Before doing anything else, list your current metadata fields and any controlled vocabularies (whether they are in a dropdown or maintained elsewhere).
  2. On another tab, list your would-be changes, and note how they map to, or replace, existing fields. Color-coding can be very helpful.
  3. On a third tab, list any fields you want to remove entirely. If you know how many assets they may apply to, add that information. Also list net new fields. You may have this listed on your second tab, but it can be helpful to see it at a glance, especially when you move on to the next phase.

4. Get feedback:

  1. Talk to your users! Take time to walk through your proposed changes with some key users, and modify your spreadsheets accordingly.
  2. Card sorting exercise. You can do this in person with some of your users, or conduct a virtual card sort if your team is spread out geographically. There are a number of sites that offer free trials of their card sorting tools, or, if you have the budget, it can be well worth exploring in more depth. Knowing how your users categorize your assets – at least in very high-level groups – can tell you what you need to improve about your model. It will also highlight areas of confusion, and is a great way to test whether a particular field is of any use at all, or if it needs to be re-named. You can use your spreadsheets as a starting point.

5. Test & Implement:

  1. Get your new fields and drop-downs into your DAM, but keep it to a staging environment at first. Again, this step may be minor, or a very complex exercise, depending on your software and configuration.
  2. Perform user acceptance testing (UAT): ask users to test drive the modifications to the system to see if your hunches about useful terms and fields were correct.
  3. If UAT went well, and the metadata mapped to your existing assets as expected in your testing environment, you’re ready to push those changes live!

6. Communicate:

  1. Let your users know that change is afoot – give them a heads-up in advance, and as the changes roll out. Whether that’s with a notification in your system, an email alert or a personal communication, let them know that you’re working to make use of the DAM easier for them.
  2. Ensure it’s a two-way street – do they have an easy way to let you know they need help, or if they have suggestions for your next round of changes?

But My DAM Won’t Let Me Change It (Easily)!

It’s all well and good to think about how your metadata model will work in an ideal world, but you may have a DAM that makes such changes hugely cumbersome. You are not alone. While some DAMs have been thoughtfully designed with the user—administrative or otherwise—in mind, others make changing your metadata model extremely difficult.

If you’re one of the lucky ones, adding or modifying metadata fields can be done through your user interface – you’ll just want to ensure you have a governance process in place so that only administrators (or other trusted users) can make changes to your fields. You may even have a handy taxonomy management tool built in that will let you create related terms, ensuring that your users who search for ‘soccer’ also find ‘football’ if that’s what they were expecting. Many systems even let your users add their own tags to assets, and you can ensure good metadata hygiene by regularly reconciling these crowdsourced tags with ‘approved’ terms.
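
If your system lacks a built-in taxonomy tool, the related-terms idea is still simple enough to prototype outside it. A minimal sketch of a synonym ring, using the soccer/football example above (the second ring is invented):

```python
# Each ring is a set of terms treated as equivalent at search time.
SYNONYM_RINGS = [
    {"soccer", "football"},
    {"sneakers", "trainers"},
]

def expand_query(term):
    """Expand a user's term to every synonym in its ring."""
    term = term.lower()
    for ring in SYNONYM_RINGS:
        if term in ring:
            return ring
    return {term}

print(expand_query("soccer"))  # {'soccer', 'football'}
```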

Other forward-thinking DAM vendors let you edit metadata in bulk. While it seems that this should be a standard feature, it’s noticeably absent in quite a few solutions, so it adds to your slate of maintenance projects when you need to do it manually (or write a script to make it happen). Adding a field that needs to be applied to thousands of assets, or modifying one that’s already in use with an equally large number, is very straightforward in some DAMs, but can be a huge project requiring considerable IT support in others.
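
When a script is the only route, the shape of it is usually a loop over asset IDs against whatever API your vendor exposes. The endpoint, field name and auth below are entirely invented for illustration; check your own system’s documentation before attempting anything like this:

```python
import requests

API = "https://dam.example.com/api/assets"  # hypothetical endpoint

def bulk_set_field(asset_ids, field, value, token):
    """Apply one metadata value to many assets, one call at a time."""
    for asset_id in asset_ids:
        resp = requests.patch(
            f"{API}/{asset_id}",
            json={field: value},
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()  # stop on the first failure
```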

Most seem to sit somewhere in the middle: in many DAM solutions, it may require a bit of front-end scripting to make those changes, or even a full-blown dive into back-end programming. If you’re managing one of the more cumbersome systems out there, and making changes is something that needs to be its own project, you’ll quickly run into an even-more-pressing need for governance. Which leads us to the next potential problem (or opportunity).

But I’m Always Making Changes!

Regular maintenance is the key. You’ll find all manner of best practices, but you’ll need to decide what works best for your DAM. Do you have quarterly reviews of your metadata model? Are you constantly adding new keyword terms to keep up with new content types or products? Could you group those more efficiently in a standard field? Are they not easily findable as they are tagged now? Most importantly, what terms do your users actually employ?

In short, you’ll want to come up with a variation on the following steps:

  • Create a metadata governance team – build in a regular cadence to meet with key users and stakeholders, and keep communication lines open.
  • Stick to your review schedule – don’t let maintenance become eclipsed by other projects.
  • Determine technical challenges – if changes to your model are always going to be a high level of effort, can they be coupled with other technical projects (e.g. upgrades, UI changes)?
  • Test and re-test with your users: yes, it takes time, but it’s always a worthwhile exercise.
  • Communicate: let your users know beforehand if you’re making major changes, and make sure you help them navigate them when they go live.

Parting Thoughts

The perfect metadata model is always a moving target. But even as an ongoing work-in-progress, using existing standards can help simplify the process of determining your core fields, and how you want to use them in your DAM. But never be afraid to deviate from a standard if it simply doesn’t make sense for your organization, as long as you maintain a consistent approach. You can create your own in-house standards when no others fit the bill, but you can avoid reinventing the wheel for a goodly portion, simply by exploring the metadata standards landscape. It’s partially a well-signposted journey, but certainly requires some traveling off the path!

About Lisa Grimm

While in grad school for archaeology, Lisa Grimm fell into a career as a web developer (back before HTML had tables), and bounced from London to Silicon Valley, then on to NYC and Philadelphia, focusing ever-more on content and digital assets as she worked in tech, government and publishing. Midway through her career, she went to library school to obtain an MS-LIS degree, and left ‘straight’ tech to work in DAM for a number of libraries, archives and museums. She’s back on the corporate side now, serving as Content Librarian for GSK, where she oversees the company’s DAM ecosystem, taxonomy and metadata standards.

Lisa has been a DAM Guru Program member since February of 2014. Connect with her on LinkedIn.


Read more from the “Librarian Tips for DAM Managers” DAM Guru Program series »

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

Why Librarians Understand DAM

By Linda Rouse

The social profile of librarians as “custodians of knowledge” in the community (despite the often derogatory stereotyping) attracts people who have a curiosity and interest in information and research, and in developing the requisite skill sets to become knowledge or information workers.

Librarians understand assets. One of the key lessons taught in library schools is that information is valuable and knowledge is power, whatever form it takes—a book, magazine, picture, video or any of the myriad digital formats that make up the world of information today. So managing images and videos is not so very different from managing books and journals—many of the same rules apply.

Librarians learn to catalog and classify items according to global standards. We learn about collection management. We identify different editions and formats for version control. We understand the importance of governance in managing assets.

We have the expertise to research and apply metadata schemas and taxonomies. We understand the business value of efficient asset discovery and findability. We know about copyright and intellectual property, and we can write or develop appropriate policies for effective digital rights management.

These skills can each be readily translated into the world of digital asset management. In fact, these skills are among the first that DAM managers not from library science backgrounds need to learn.

It’s when we think about the adoption of DAM, user training and best practices that the experience of the librarian really comes into its own—from the public librarian organising read-aloud groups for children, to the many special librarians producing what’s-new lists for their clients.

Librarians are skilled at encouraging and training users to find materials that match their needs. We know that different types of users require different strategies and methodologies to inform and empower them to use a system, and we are experts when it comes to developing best practices to meet these requirements.

So when in doubt, ask a librarian!

About Linda Rouse

Linda Rouse, BA DipLib AALIA (Associate of the Australian Library and Information Association), has been a practicing librarian for many years. Her career started at the University of New South Wales, Australia, where she acquired her post-graduate Diploma of Librarianship. Rouse then became a cataloguer and later a reference librarian for the State Library of New South Wales, and spent a further 10 years doing electronic research as a freelance contractor. The lure of the Internet tempted her away from traditional librarianship to educate users on ’Net searching and building Web pages. Rouse became involved with Digital Asset Management in its early years, crediting the industry’s “Mother of DAM,” Jennifer Neumann, for much of her transitional training. She has since been dedicated to the promotion of DAM through education in her role as Information Manager for Australia’s DataBasics.

[box]
Linda Rouse passed away on 11 March 2017. She was steadfastly dedicated to the Library Sciences and helping others with their information management goals. She was a longtime member and supporter of DAM Guru Program. She will be missed.
[/box]


Read more from the “Librarian Tips for DAM Managers” DAM Guru Program series »

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

Librarian Tips for DAM Managers

An article series by DAM Guru Program library science professionals

By David Diamond

It took the digital asset management software industry only about 15 years to recognize that this radical new thing we created had actually been created long ago. From taxonomies to metadata to categorization systems and more, what DAM advocates proclaim to be the future of content management has actually long been its history too.

We built a bridge between traditional libraries and libraries of the future, and promptly forgot to include the very people who could make that transition work best—the librarians, archivists and other library science professionals whose training and experience are all about what we do.

Some argue that Digital Asset Management and Library Science are on a collision course of fate, where one becomes the salvation of the other, enabling both to prosper in the coming decades.

I say that collision has occurred.

DAM Guru Program #GuruTalk profiles have introduced us to DAM managers from a wide variety of backgrounds. One thing that most of these people have in common is that they have no library science training. Digital asset management technologies have made it possible for many of us (myself included) to learn about DAM on the job—or so we think. But just as using Microsoft Word doesn’t make one a writer, managing a DAM doesn’t make one an information professional.

This “Librarian Tips for DAM Managers” article series is authored by DAM Guru Program members who are trained information professionals. The authors present DAM topics from their library science perspectives, which just might fill in some educational gaps for the rest of us.

Thanks to series coordinator Tracy Wolfe and her fellow librarians for offering us these tips about what has worked and not worked for—you know—the past few thousand years or so.

David Diamond
DAM Guru Program Creator

 


Commercial Exploitation of Digital Assets

This article was originally published on the DAM Coalition website, a property of Pro Video Coalition. As DAM Coalition was decommissioned in early 2015, this content was moved with permission.

by Nick Sincaglia

As a continuation of the theme from my Finding Inspiration from Metadata Standards article, I want to discuss the challenges surrounding the development of metadata models for the commercial exploitation of digital assets.

One of the benefits of developing a business in the field of digital media is the versatility and flexibility that file-based digital content affords you. However, with these benefits comes the potential for increased complexity in how you manage the metadata for your digital assets—that is, if you want to be able to take advantage of the many new opportunities now available to you. I frequently say to people, “You might know what business you are in today, but are you certain you know what business you will be in tomorrow?” The pace at which businesses must adapt to threats and new opportunities is significantly greater today than just a decade ago. And from what I can tell, there are no signs of it slowing anytime soon.

Listen to the Music (Industry)

So, what are some digital asset management strategies you can apply today to help keep your organization competitive tomorrow? My response to this question is to try to learn from others, perhaps other industries, that might have already addressed some of the challenges your organization or industry is just now beginning to address, or will likely need to address in the future.

When it comes to commercially exploiting digital assets and experimenting with new business models, there is a lot one can learn from the music industry. It is easy to discount this statement, considering what you might have heard from popular news outlets about the troubles the music industry has experienced over the past decade. However, the music industry was the first of the media industries to have to address these challenges head-on, and they have been doing so for well over a decade now. As a result, they have gained a lot of experience and it shows when you look at the data models they have developed and included in their metadata standards.

For the past 10 years, the technical leaders at many of the record labels, collecting societies and digital retailers have been collaborating on a regular basis to try to develop metadata standards that will increase clarity and reduce frictions in their day-to-day business communications. The results of this effort can be found in the Digital Data Exchange (DDEX) standards.

Digital Data Exchange (DDEX)

I would like to focus on one aspect of the DDEX standards, which addresses the commercial exploitation of digital assets.

The DDEX standard is an XML-based metadata standard for exchanging information between business partners. Content owners wanted a standardized way of communicating with their business partners that would clearly describe the contractual terms that govern how their digital assets may be exploited commercially. Music can be used and consumed in so many different ways, so great effort was taken to define data structures versatile enough to handle all known cases.

The DDEX standard encapsulates all its commercial exploitation information within what is called the “Deal” data composite. While no metadata standard is perfect, I think DDEX has done a very good job in expressing the combination of elements one would need to describe the many possible dimensions required in the majority of business scenarios that exist in today’s digital marketplace.

Let’s take a tour through the “Deal” data composite to highlight what is there and how you might learn from or repurpose some of these ideas for use in your own business and industry.

Deconstructing the “Deal”

Let’s start with the basics: The first thing to consider is territory information. Every “Deal” must specify the territory to which it applies. Every country has its own unique laws, in terms of taxation, intellectual property and decency, of which one must be cognizant.

In addition, each territory has its own cultural standards, business environment and popular methods of content consumption that one must consider when defining the commercial terms of a digital asset. Obvious examples are wholesale and retail pricing, which must be specified in the local currency.

Finally, due to restrictions defined by other licensing contracts, release windows and marketing campaign schedules, content owners must define the start and end dates between which these commercial exploitation terms apply.

Once the time and place for which the digital assets can be commercially exploited is defined, the next area of focus for definition is in the content’s “Usage.”

Defining content Usage is more complex than it might seem. Content Usage encompasses not only the means by which the consumer will access or experience the content (download, on-demand stream, conditional download, content-influenced stream, non-interactive stream, ringtone, ringback tone, etc.), but also many other conditions that surround the act of accessing and experiencing the content. These might include:

  • the type of device or user interface the consumer is using (mobile, kiosk, personal computer, game system, home entertainment system, broadcast receiver, physical media writer, etc.)
  • how the content is delivered (wired, wireless, satellite broadcast, terrestrial broadcast, p2p, physical media, etc.)
  • the type of carrier on which the media is allowed to be fixed (CD, DVD, Blu-Ray, VHS, etc.)

Each unique combination of “Usage” dimensions may dictate its own set of restrictions or price differentiation. Because of this, the standard must support the ability to explicitly define commercial terms for each described “Usage” combination.
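
To make the shape of this concrete, here is an illustrative (and deliberately simplified, not schema-accurate) sketch of the dimensions a DDEX-style “Deal” has to capture, with field names paraphrasing the concepts above:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class Deal:
    territory: str                 # e.g. "US" -- local law, tax, currency
    start_date: date               # when these commercial terms begin
    end_date: Optional[date]       # None = open-ended until takedown
    usages: List[str]              # e.g. ["on-demand stream", "download"]
    devices: List[str] = field(default_factory=list)  # e.g. ["mobile", "kiosk"]
    business_model: str = "purchase"   # subscription, rental, promotional...
    wholesale_price: Optional[str] = None  # in the territory's local currency

# One usage/territory combination with its own terms:
us_subscription = Deal(
    territory="US",
    start_date=date(2015, 6, 1),
    end_date=None,
    usages=["on-demand stream"],
    business_model="subscription",
)
```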

As complex as this might already sound, there are other dimensions one might need to express in combination with “Usage.” For example, one may want to tailor the deal terms for their digital assets to be more or less favorable based on what type of business model their business partner is using. If the content is being included in a monthly subscription service, the content owner might want to restrict certain types of Usage, or include incentives for subscribers to purchase the media. If one’s business partner operates a media rental business, content owners might want to define the length of the consumer’s rental period. Or, if the business partner is a promotional outlet, the content owner might want to define the number of free plays the consumer is allowed.

Another consideration that could affect the commercial terms is a digital asset’s technical specification or quality, such as encoded bitrate (SD, HD, lossless, etc.), number of audio channels (mono, stereo, 5.1 surround, 7.1 surround, etc.), 3D video, etc.

In recent years, there has been some experimentation around offering the consumer the ability to pre-order content before it has been released. Defining the terms that govern when a pre-order deal can be advertised publicly, and the price incentives offered to the consumer, must be something that can be defined and communicated to retailers.

And don’t forget about digital returns. Yes, you read that right.

The concept of a digital return was introduced by iTunes, which allowed consumers to purchase individual tracks on an album; if a consumer later decided to buy the whole album, iTunes would provide credit for the cost of the individual tracks, then message back to the content owners that it was issuing a digital return for those tracks.

Allowing or disallowing digital returns was an unexpected communication requirement that most content owners were initially unprepared to handle.

The last significant component of the DDEX “Deal” describes a means by which content owners can tell their business partners that they are no longer allowed to offer the digital asset to the consumer.

I mentioned earlier a start and end date within the commercial terms, but this is slightly different. There are times when a content owner suddenly loses the rights to distribute a digital asset to their business partners. This can happen through expiration of a license agreement, a lawsuit, or transfer of ownership. No matter what the reason for the sudden change in distribution rights, the content owner must quickly notify its business partners that they no longer have the right to license a particular digital asset, and the business partner must remove the ability for the consumer to access it.
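
In practice, such a notification boils down to a small, urgent message. A hypothetical shape for one might look like this; the fields and identifier are invented for illustration, and DDEX defines its own takedown mechanics:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TakedownNotice:
    """Hypothetical notice telling a partner to stop offering a digital asset."""
    asset_id: str         # e.g. an ISRC or internal identifier
    effective: datetime   # when consumer access must be removed
    reason: str           # license expiry, litigation, ownership transfer, ...

notice = TakedownNotice("US-ABC-15-00001", datetime(2015, 6, 1), "license expired")
```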

Clearly, commercial exploitation of digital assets is complex today. I feel fairly certain in saying that it is likely to get even more complex in the future.

If some of the circumstances described above apply to your business today, or you foresee your industry trending toward some of them but are not sure how to best handle these types of situations, I recommend taking a look at the Digital Data Exchange metadata standards. There might be some data modeling techniques that you will find useful in keeping your organization agile so that it can take advantage of new opportunities or stave off new threats and remain competitive in the digital marketplace.


Nick Sincaglia

Nick Sincaglia is President/Founder of NueMeta LLC, a consulting firm focused on digital asset and metadata management strategies. Nick’s company provides software development and system design consulting services for leading digital media & entertainment companies. Nick has been active in several industry technical standards bodies, serving as a Board Member and Working Group Chairman for the Digital Data Exchange (DDEX) and NARM’s Digital Think Tank, and as a member of Metadata Working Groups sponsored by the European Broadcasting Union and the Audio Engineering Society. Nick has been a member of DAM Guru Program since 2013.

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

Finding Inspiration (and Solutions) From Metadata Standards

This article was originally published on the DAM Coalition website, a property of Pro Video Coalition. As DAM Coalition was decommissioned in early 2015, this content was moved with permission.

by Nick Sincaglia

The Value of a Metadata Standard

I have been active in a number of metadata standards organizations over the years. While the standards development process can sometimes be painfully slow, I recognize there is enormous value in being involved in such efforts. I have come to believe that if one takes the time to study and understand the design considerations behind the data structures that make up these standards, one will develop new insights and a deeper understanding of how that industry operates.

When designing new DAM systems, I always look to existing metadata standards, which I know have been painstakingly debated in committee. While there is no such thing as a perfect metadata standard, I believe that incorporating some of their elements into my work helps me avoid common data modeling mistakes and ensures I have included important data modeling considerations in my design.

Metadata Standards from Outside the Box

I don’t limit my search for good metadata models to standards that derive from the specific industry vertical in which I am working. Due to the malleability of digital content, industry verticals that were once considered separate and distinct are now beginning to converge. A wealth of experience and knowledge can exist inside these other standards, contributed by people with vastly different work experiences than mine, and I am interested in taking advantage of it. If I think the industry I am working in will eventually face the same challenges, I can anticipate them by using aspects of these other standards.

One good example of this occurred about a year ago, when I was looking for a well-designed and versatile data structure to store metadata about creative artists. I wanted to capture the type of metadata that would enable me to uniquely identify a person and, in addition, store a variety of metadata elements describing how that person’s identity had changed over time.

I was certain I was not the first data designer to have this need. I did not want to build such a complex data structure alone because it would likely be inferior to one that was collectively defined by a knowledgeable standards community.

It took me about three days of searching, but I finally found what I was looking for when I came across the “Encoded Archival Context – Corporate Bodies, Persons and Families (EAC-CPF)” metadata standard. It was actually more than I was looking for, which was even better! The standard was reasonably new and it had been developed by the international library and digital archive community. Clearly, this was a community that had expertise and knowledge far beyond my own in managing the identities of people, groups and organizations. I was delighted to discover that the standard supported complex scenarios far beyond my immediate needs.

Metadata for Moving Targets

It should not be surprising that the library community had developed such a standard as this—one that addressed the challenges associated with capturing data about an individual or group whose identifying information might change over time.

An example of the value this offers would be a young woman who publishes her master’s thesis to complete her university studies. After she finishes her schooling, she gets married and changes her last name. Several years go by and she publishes more writings under her married name. Perhaps she decides to use a “pen name” for some of her more controversial writings. Later, she returns to university, completes her doctorate, and her name changes again. Perhaps she is divorced or widowed, later re-marries, and again changes her married name. Finally, her writings are so influential that she receives damehood (the female equivalent of knighthood) from the British Monarchy.

Many works, many names, but all from the same person.
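
As a rough illustration of the idea (not the EAC-CPF XML schema itself), one might model each name as a time-bounded entry attached to a stable person identifier. All names and dates below are invented:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class NameEntry:
    """One identity used during a period, loosely inspired by EAC-CPF's nameEntry."""
    name: str
    name_type: str                   # e.g. "maiden", "married", "pen name", "honorific"
    used_from: Optional[int] = None  # year the name came into use
    used_to: Optional[int] = None    # year it fell out of use (None = still in use)

@dataclass
class Person:
    person_id: str                   # stable identifier that outlives any name change
    names: List[NameEntry] = field(default_factory=list)

author = Person("person:0001", names=[
    NameEntry("Jane Smith", "maiden", 1998, 2003),
    NameEntry("Jane Porter", "married", 2003, 2011),
    NameEntry("J. P. Hartley", "pen name", 2005, None),
    NameEntry("Dame Jane Porter", "honorific", 2011, None),
])
```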

The EAC-CPF standard is designed to capture this level of complexity. It was well worth the time it took for me to find the standard. I could confidently design its data structures into my core data model because I was certain it would not only meet my current needs, but could support any future needs I might have.

I encourage you to do the same.

The ideas and knowledge expressed within metadata standards can be leveraged in new and useful ways if you take the time to understand them. Look outside your immediate industry’s standards and recognize that other industry verticals may have already experienced the challenges you are now facing.


Nick Sincaglia

Nick Sincaglia is President/Founder of NueMeta LLC, a consulting firm focused on digital asset and metadata management strategies. Nick’s company provides software development and system design consulting services for leading digital media & entertainment companies. Nick has been active in several industry technical standards bodies, serving as a Board Member and Working Group Chairman for the Digital Data Exchange (DDEX) and NARM’s Digital Think Tank, and as a member of Metadata Working Groups sponsored by the European Broadcasting Union and the Audio Engineering Society. Nick has been a member of DAM Guru Program since 2013.

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.

The Concept of Trust in Your Workflow

This article was originally published on the DAM Coalition website, a property of Pro Video Coalition. As DAM Coalition was decommissioned in early 2015, this content was moved with permission.

by Nick Sincaglia

I was talking with a client recently about some of the challenges they were experiencing with a new workflow they had implemented. It was a bit of a departure from their existing workflows in that it relied much more heavily on external data providers. Until this point, their workflows had been driven primarily by internally created data, for which they had defined strict guidelines and on which they relied heavily throughout their workflow systems.

This client had expanded their operations into a new area of business, which required them to rely on outside data supplier partners to a much greater extent for the quality and consistency of the data entering their systems. Needless to say, these new data streams experienced much higher failure rates and, as a result, stressed their workflow systems in ways they had not experienced before.

The failures were causing a “log jam” of media files that would quickly fill up the workflow storage reserves and clog parts of their IT network, due to the re-routing and reprocessing of these files. A cascade of completely new workflow challenges had caught the organization off guard and was causing a significant amount of stress on their operations.

While talking through these problems that the client was experiencing, the concept of “trust” surfaced in our conversation. Since my background in workflow development has always tended to involve at least some amount of external data integration, “trust” was a concept I had deeply internalized and always included in my workflow designs.

It was through this conversation that I realized that building this concept of “trust” into my workflow designs was something that slowly evolved over time from experience. This client’s existing workflows had made assumptions regarding “trust,” which turned out to be no longer valid when applied to the external data they were receiving.

Trust in Workflow Design

So what do I mean when I refer to “trust” within a workflow design? Trust, in this circumstance, is the level of confidence one has in the quality and consistency of the data being used as an input to any component within a workflow system. Trust can be applied to your business partners’ data feeds, your employees’ work quality, a server, the network, software applications, hardware devices or sub-components within your workflow systems. Each link in your workflow chain can be assigned a trust value that you can use to drive your workflow designs and increase the robustness and reliability of your workflow automation.

And as the saying goes, “trust should not just be given; it must be earned.”

This was the mistake my client had made. Their confidence that these new data feeds would have the same quality and consistency as their internal data feeds was set too high. As a result, their workflow systems were stressed in ways they did not anticipate, and now they needed to re-evaluate how they did things.

So, how does one build the concept of trust into one’s system design?

I have learned to develop profiles for the various components that make up my workflow systems. There are no hard and fast rules as to the level of granularity one must establish for these component profiles. It really depends on your situation and the level of detail you want to track, monitor or control.

Each of these component profiles may also include data level profiling. The idea is that you assign a trust ranking to each data input component that can be used to programmatically make decisions on how to process the data.
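
A component profile can be as simple as a record that pairs each input source with a trust ranking. The sketch below uses invented component names and scores:

```python
# Illustrative component profiles; the fields and granularity are up to you.
component_profiles = {
    "partner_feed_A":  {"type": "external_data_feed", "trust": 0.9},
    "partner_feed_B":  {"type": "external_data_feed", "trust": 0.4},
    "internal_ingest": {"type": "internal_service",   "trust": 0.95},
}
```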

For example, if your data is coming from a workflow component that has a high ranking of trust, and you receive an update to a previously delivered data record, you might programmatically design your workflow systems to automatically accept that updated data record. If, under the same scenario, the data is coming from a low trust ranking data component, the data update might be further scrutinized by running consistency checks against other data sources, or by routing it to a manual process for human review. Each component within your workflow is designed to check an input’s trust ranking before processing the data and, in doing so, might receive instructions on how to process the data.
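
Here is a minimal sketch of that decision logic, assuming trust scores like those above and invented thresholds:

```python
# Thresholds are illustrative; tune them to your own tolerance for risk.
TRUST_AUTO_ACCEPT = 0.8   # at or above: apply updates automatically
TRUST_CROSS_CHECK = 0.5   # between thresholds: verify against other sources

def route_update(source_trust: float) -> str:
    """Decide how to process an updated record based on its source's trust rank."""
    if source_trust >= TRUST_AUTO_ACCEPT:
        return "auto_accept"        # apply the update without intervention
    if source_trust >= TRUST_CROSS_CHECK:
        return "consistency_check"  # run checks against other data sources first
    return "manual_review"          # low trust: route to a human reviewer

print(route_update(0.9))  # -> auto_accept
print(route_update(0.4))  # -> manual_review
```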

Trust as a Variable

Trust ranking is not a bulletproof way to guarantee that your workflow systems will never inadvertently allow low-quality data to slip through. Unanticipated data quality issues will always surface. However, this approach, if designed properly, will enable you to expand the granularity of these data quality checks and decision-making responses over time. Remember, the data quality of your workflow systems is not static; your data suppliers and workflow components may change over time, and their level of trust may rise or fall.

Before a workflow component starts to process the data it receives, it can be designed to quickly check who the supplier of the data is and what other sub-components had previously processed the data. From there, it can be instructed on what data elements should be scrutinized during the processing of the data.
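
A provenance check might be as simple as carrying a processing history with each record. The envelope shape and the rule below are invented for illustration:

```python
# A record travels with its provenance: origin plus prior processing steps.
record = {
    "payload": {"title": "Track A", "duration_sec": 214},
    "source": "partner_feed_B",
    "processed_by": ["ingest_gateway", "format_normalizer"],
}

def fields_to_scrutinize(rec: dict) -> list:
    """Pick which fields deserve extra checks, given the record's provenance."""
    # Invented rule: data from partner_feed_B has a history of bad durations.
    if rec["source"] == "partner_feed_B":
        return ["duration_sec"]
    return []

print(fields_to_scrutinize(record))  # -> ['duration_sec']
```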

At the same time, this trust ranking concept should not unnecessarily impede the data flow through your systems. One needs to balance the need for total data quality with the rate at which data must flow through the system. In most workflow systems, it is unacceptable for the output of any workflow component to fall too far behind the data rate of its input.

One situation where I found the greatest need for this trust ranking concept was when I was working with systems that mixed data associated with content from opposite ends of the “Long Tail.” The concept of the “Long Tail” was popularized by Chris Anderson’s book “The Long Tail: Why the Future of Business is Selling Less of More.”

Valuing the Tail End of the Long Tail

One side effect of the long tail that I noticed in my work was that metadata quality degraded the further down the long tail one went. I don’t remember Chris Anderson discussing this “feature” of the long tail in his book. From my perspective, this was the “dark side” of the long tail that the book failed to mention.

[Figure: the Long Tail. Digital assets that are less popular typically don’t receive the same level of metadata attention that more popular assets receive.]

In a typical marketplace, there are a number of parties involved in making content available. Typically, there are creators, manufacturers, distributors and retailers. Even in a purely digital environment, these same roles tend to persist.

The “dark side” of the long tail is that the further down the long tail one goes, the less incentivized one is to spend time on metadata quality and consistency issues. Time is money, as the saying goes. The fewer copies of the content one expects to sell, the less likely one is to earn back the money invested in the content’s metadata.

If the creators of the content do not supply high quality metadata with their media, the responsibility of doing so is passed to the manufacturer. If the manufacturer does not have the incentive, they will pass the responsibility to the distributor. If the distributor lacks incentive, the responsibility continues on to the retailer. And if the retailer is not motivated to cleanse the metadata, it will simply get passed to the consumer.

So, when you are mixing metadata associated with both front-tail and long-tail content, the concept of trust plays a very big role in how you design your workflows. Professionally produced content tends to have much higher metadata quality, because the suppliers of the content have a vested interest in making sure the content is properly prepared so that it can be easily received and processed by each party within the retail supply chain and purchased by the consumer. The opposite tends to be true for long-tail content, for the simple reason that the more time one spends addressing content metadata issues, the lower the probability of making back the money spent doing so.

If you think about it, this same situation exists in almost all content environments, even in your company’s internal content systems, though perhaps not to the same extreme. There will always be high-value content and low-value content. Too much time and effort spent on the quality and consistency of low-value content could result in a net loss to the organization.

Your organization probably already internalizes this reality in the way it runs its business, putting more effort into the quality and consistency of the data surrounding its high-value content. And if you think about it a little further, shouldn’t your workflow systems be able to internalize these same concepts?


Nick Sincaglia

Nick Sincaglia is President/Founder of NueMeta LLC, a consulting firm focused on digital asset and metadata management strategies. Nick’s company provides software development and system design consulting services for leading digital media & entertainment companies. Nick has been active in several industry technical standards bodies, serving as a Board Member and Working Group Chairman for the Digital Data Exchange (DDEX) and NARM’s Digital Think Tank, and as a member of Metadata Working Groups sponsored by the European Broadcasting Union and the Audio Engineering Society. Nick has been a member of DAM Guru Program since 2013.

DAM Guru Program recognizes this article as worthy of the #LearnDAM designation for materials that provide genuine digital asset management education without sales agendas. Search #LearnDAM on Google for more materials.