Setting up idea evaluation

Idea evaluation is one of our premium features that can be enabled and configured separately for each board.

Adding an evaluation process is an excellent way to get additional data from users on both the quantifiable and abstract aspects of individual ideas. Unlike comments, for example, the evaluation feature lets you gather numeric data for analyzing ideas, using freely definable metrics.

P.S. To be able to complete the following steps, you’ll need to have an admin account for the desired Viima board. Please contact your Viima super user for access if you don’t already have an account.

How to get there

If you already know your board name, the easiest way to set up the evaluation feature is through the following URL:

https://app.viima.com/admin/organization-name/board-name/#settings/features

If you do this, proceed directly to Step 3. Otherwise, start from Step 1.

Step 1:

On your Viima Admin account, go to "Settings" in the admin portal of your board.

Step 2:

Open the "Process" sub-section, then click on "Evaluation".

Step 3:

From here, you can set up idea evaluation for your board.

What to change

Now you should have the view from Step 3 open.

You can enable or disable the evaluation of ideas from the “Enable an evaluation process for the ideas” checkbox. To create a new evaluation metric, click the blue “Add new” button.

This will open a window where you can specify the details of your evaluation metric.

When you create a new evaluation metric, there are various aspects that you can set depending on what kind of evaluation process you wish to build. We’ll go through each aspect individually by creating an example metric.

Name

This defines the name of your metric and should be as descriptive as possible so that users understand what exactly they’re evaluating.

For our example, we’ll use a very quantifiable metric: “Cost to implement.”

Description

If the metric's name is not descriptive enough for users to understand what aspect of the idea they are evaluating, you can provide more information in the Description section. This will be shown to users under the metric's name in the idea's evaluation tab.

Unit

This defines the unit in which the evaluation range is measured. For something abstract, e.g. business potential, there might not always be a clear measurable unit, or using one might not make sense. In these cases, the Unit field can be left empty, as the Name itself defines the context of the evaluation.

For our example, since the cost is quite easily quantifiable, we’ll use the euro symbol (€).

Minimum

This defines the minimum value, or “bottom” of your range. This rarely needs much thought, as 0 or 1 is typically the minimum value for any range. However, this can be freely changed if your custom range requires it.

For our example, we’ll keep it at 0, as it is a logical way to signify “free” as in "no cost involved".

Maximum

This defines the maximum value, or “roof” of your range. For something abstract, such as business potential, it's often best to keep the range itself abstract, as asking for precise numeric values would likely only lead to false confidence in the numbers. A scale of 1-10 or 0-5 is a good rule of thumb for metrics that don’t have a clear unit.

For our example, we’ll use a number that we perceive to be the maximum possible cost for ideas on this board. For metrics such as cost, configuring the scale may sometimes be difficult as the value can vary greatly for different types of ideas.

Weight

This defines the relative weight of the metric when calculating the Viima score for an idea. By default, the weight is one, meaning the metric is counted once in the average of all ratings. If you set the value higher, the metric is counted that many times in the average. Alternatively, you can set the weight to zero if you want the metric to be ignored completely when calculating the Viima score.

For our example, we'll use the value 2 as the weight of the metric because implementation costs are generally a big factor in everything.

Is a higher value better?

This checkbox defines whether a higher value is better than a lower one for this metric. “Higher is better” is the natural choice for most metrics, as it is typically more intuitive to signify a good result with high numbers rather than low ones. However, sometimes “lower is better” makes sense too, such as when it comes to costs.
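
To illustrate how the minimum, maximum, weight, and “higher is better” settings can work together, here's a minimal sketch of a weighted, direction-aware average in Python. The function name, the normalization to a 0-1 scale, and the sample numbers are assumptions made for this example; it is not necessarily the exact formula Viima uses for the Viima score.

# Illustrative sketch only; Viima's actual score calculation may differ.
def weighted_score(metrics):
    """metrics: list of (average_rating, minimum, maximum, weight, higher_is_better)."""
    total = 0.0
    weight_sum = 0.0
    for rating, minimum, maximum, weight, higher_is_better in metrics:
        if weight == 0:
            continue  # weight 0: the metric is ignored completely
        normalized = (rating - minimum) / (maximum - minimum)  # scale the rating to 0..1
        if not higher_is_better:
            normalized = 1 - normalized  # invert "lower is better" metrics, e.g. cost
        total += normalized * weight     # weight 2 counts the metric twice, and so on
        weight_sum += weight
    return total / weight_sum if weight_sum else 0.0

# Example: "Cost to implement" (0-100 000 euros, lower is better, weight 2)
# combined with a 1-10 "Business potential" metric (weight 1).
print(weighted_score([
    (20000, 0, 100000, 2, False),
    (8, 1, 10, 1, True),
]))  # prints roughly 0.79 on a 0-1 scale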

User rights - Who can evaluate ideas with this metric?

From this dropdown menu, you can choose, based on user roles, who can evaluate ideas with this specific metric.

Here's a breakdown of who the different options will affect:

  • Everyone: All board users may evaluate ideas on this metric

  • Idea responsibles: Admins may evaluate all ideas on this metric, and those who have been assigned as responsible for an idea may evaluate that specific idea

  • Admins: Only admins of the board may evaluate ideas on this metric

For our example, we'll choose "Everyone" as in this case, we want all users to partake in evaluating the cost of implementation at an early stage to get an overall understanding.

User rights - Who can see the average evaluation value?

From this dropdown menu, you can choose, based on user roles, who is allowed to see the average value of this metric. The average value is calculated from all evaluations given to the metric.

Here's a breakdown of who the different options will affect:

  • Users who have already given their evaluation: Only after a person has given their own evaluation will they see the average evaluation score on the slider.

  • Everyone: Every user on the board will be able to see the average evaluation score of the metric after it has been evaluated at least once.

  • Idea responsibles: Admins can see the average evaluation score of this metric for all ideas, and those who have been assigned as responsible for an idea can see the average evaluation score of that specific idea after it has been evaluated at least once.

  • Admins: Only admins may see the average evaluation score of the metric.

For our example, we'll choose "Users who have already given their evaluation" as it is not necessary to restrict visibility from anyone but we do not wish to provide bias to any evaluations by showing the average value before an evaluation has been given.

Related status

This defines the status to which the metric is related. Each metric must be related to a specific status. However, you can create as many metrics for as many statuses as you like. The status you choose is highly dependent on the nature of your metric and the board. Some metrics give much more relevant results at the beginning of your ideation process and some at the very end.

Active categories

This defines the categories in which the metric is present. All ideas in the chosen categories will include this metric in the evaluation tab. If you wish to choose only specific categories, mark the checkboxes next to their names. If you wish for the metric to be present in all categories, leave all checkboxes unmarked. Using different metrics for different categories often makes sense if the ideas in different categories are very different.

For our example, we'll choose "All," as this metric is relevant in all of the available categories.

Now, just click the green "Create" button and your newly created metric will appear on the list of evaluation metrics.

P.S. The metric will only show up on your board after you Save your changes in the “Evaluation” sub-section.

Finishing touches - Status-specific descriptions

We recommend opening the evaluation description from “Edit description” and changing it to something that accurately describes to users what they are evaluating and why. This not only helps convey the message, and thus motivates users to give evaluations, but also increases the reliability of the evaluations given.

P.S. This description is status-specific which means that if you have multiple metrics for a single status, remember to address all of these metrics in this field. If you have metrics in multiple statuses, make sure to have descriptions for all of them.

P.P.S. Additionally, remember to use this in tandem with the metric-specific descriptions: describe the big picture (what and why) in this field, and the specific details related to the metric, such as the scale, in the metric description.

After you’re satisfied with the description, click the green “Save” button.


Don't forget to save your changes!

Green "Save" buttons can be found on all pages where changes can be made. Clicking this activates those changes.
