
Hyperfish – Thoughts on Collecting Expertise

The announcement of the Hyperfish Integration Framework names three kinds of information that can now be collected:

Customers can now use Hyperfish to collect information that has been time-consuming or difficult to collect in the past, such as employee skills and expertise, asset registration, and personal information.

Two are similar, one is very different. The two similar items – asset registration and personal information – extend Hyperfish’s current approach of asking for information that has a demonstrably correct value. What is your first name? What is your last name? What is your phone number? What is the serial number of your laptop, tablet, or smartphone? In what city were you born? What school did you attend when you were 14 years old? Each question has a correct answer for which easy proof can be found (as well, of course, as a set of incorrect answers).

Employee skills and expertise, however, is a very different beast, even though the fields supplied in the Office 365 profile treat it the same way. Collecting, gathering, and discerning expertise for a profile opens a complex can of worms; consider:

  • Expertise refers to the ability to perform at an expert level in a particular subject or topic area. Some expertise is explicit (can be documented) and some is tacit (hard to express in words).
  • Declared expertise – where an individual says what they are good at – has a low level of reliability. The individual will be too modest, too extreme, or just plain wrong. One study suggests that any expertise declared by an individual about themselves should have the status of undefined (unverified).
  • Other people experience the expertise delivered by an individual, and are therefore more reliable at stating what someone is good at. That same study says the individual’s manager is the most reliable rater, and that ratings from 7-10 additional colleagues are required to match the manager’s rating.
  • Private ratings of other people are more accurate than attributable ratings. “When raters think their ratings will be or could be revealed, 67% of ratings increase significantly and become less correlated to performance.”
  • There are many content systems in an enterprise through which expertise can be demonstrated: documents written, blogs posted, discussion comments (“best answer!”) given, and emails sent. There are attempts being made to mine this growing collection of content to map expertise.

In light of the above, if Hyperfish is serious about collecting expertise in a way that’s helpful to an organisation, the approach will need to include the ability to (this list is not exhaustive):

  • Ask multiple people about the expertise they receive from a given individual, and then summarise / rank / rate / scale the result set to give an overall assessment (a rough sketch of one possible aggregation follows this list).
  • Integrate with systems that can look through documents and other written forms of expressing and delivering expertise in order to reason out key themes and areas of expertise. If an organisation is using such a system, the Hyperfish Integration Framework should allow the creation of mapping and sync rules (a hypothetical rule is sketched below).
  • Automatically populate and control some values in the single expertise field, while allowing the individual to manually edit other values.
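To make the first and third points concrete, here is a minimal sketch in Python of how multi-rater input might be aggregated and then merged with self-declared values into the single expertise field. Every name, weighting, and threshold below is an assumption made for illustration; this is not the Hyperfish Integration Framework’s API or any actual Office 365 call.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical rating record: who rated whom on which skill, and whether the
# rater is the subject's manager (per the study cited above, the manager is
# the most reliable rater, with 7-10 colleagues needed to match that rating).
@dataclass
class Rating:
    rater: str
    skill: str
    score: float             # e.g. 1.0 (novice) to 5.0 (expert)
    is_manager: bool = False

def aggregate_expertise(ratings: list[Rating], min_colleagues: int = 7) -> dict[str, dict]:
    """Summarise multi-rater input per skill.

    A skill is only marked 'verified' when a manager rating exists or at least
    `min_colleagues` colleague ratings are available; anything else stays
    'unverified', mirroring the idea that self-declared expertise alone
    should be treated as unverified.
    """
    by_skill: dict[str, list[Rating]] = {}
    for r in ratings:
        by_skill.setdefault(r.skill, []).append(r)

    summary = {}
    for skill, rs in by_skill.items():
        manager_scores = [r.score for r in rs if r.is_manager]
        colleague_scores = [r.score for r in rs if not r.is_manager]
        verified = bool(manager_scores) or len(colleague_scores) >= min_colleagues
        if manager_scores and colleague_scores:
            # Weight the manager's view more heavily (an assumed 60/40 split).
            score = 0.6 * mean(manager_scores) + 0.4 * mean(colleague_scores)
        else:
            score = mean(manager_scores or colleague_scores)
        summary[skill] = {"score": round(score, 2), "verified": verified, "raters": len(rs)}
    return summary

def merge_profile(system_values: dict[str, dict], self_declared: list[str]) -> list[str]:
    """Combine system-controlled (verified) values with manually declared ones
    into a single flat expertise list, tagging provenance on each entry."""
    merged = [f"{skill} (verified)" for skill, v in system_values.items() if v["verified"]]
    merged += [f"{skill} (self-declared)" for skill in self_declared if skill not in system_values]
    return merged

# Example: one manager rating plus one colleague rating, plus a self-declared skill.
ratings = [Rating("manager", "SharePoint migration", 4.5, is_manager=True),
           Rating("colleague-1", "SharePoint migration", 4.0)]
print(merge_profile(aggregate_expertise(ratings), ["public speaking"]))
# -> ['SharePoint migration (verified)', 'public speaking (self-declared)']
```

The deliberate design choice here is that self-declared values are never silently promoted: they stay tagged as self-declared until enough other people (or the manager) have rated the same skill.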
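For the second point, integrating with a content-mining system would likely come down to mapping that system’s output onto the profile field. The sketch below shows one invented shape such a mapping / sync rule could take; nothing here reflects the actual Hyperfish Integration Framework configuration, and SPS-Skills is simply used as the familiar SharePoint profile property name for skills.

```python
# Hypothetical mapping / sync rule: topics inferred by a content-mining system
# are synced into the profile's expertise property. The rule shape and field
# names are invented; the announcement does not describe the Integration
# Framework's actual rule format.
SYNC_RULE = {
    "source": "content-mining-system",   # assumed external expertise-mining system
    "source_field": "inferred_topics",   # topics reasoned out of documents, posts, emails
    "target_field": "SPS-Skills",        # SharePoint / Office 365 profile skills property
    "min_confidence": 0.8,               # only sync topics the miner is reasonably sure about
    "mode": "append",                    # add to, rather than overwrite, user-entered values
}

def apply_sync_rule(rule: dict, mined: list[dict], current: list[str]) -> list[str]:
    """Apply the rule to records of the form {'topic': str, 'confidence': float}."""
    accepted = [m["topic"] for m in mined if m["confidence"] >= rule["min_confidence"]]
    if rule["mode"] == "append":
        return current + [t for t in accepted if t not in current]
    return accepted  # a 'replace' mode would overwrite the current values
```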

Profiling expertise is a fascinating (and highly complex) area. While Hyperfish’s framework will give the technical ability to collect expertise and skills, more will be required to gather correct / accurate / helpful / validated answers.
