
Review

Reviews contain individual feedback related to engagements or turns. Reviews are used extensively in Salted CX and are created, among other cases, in the following situations:

  • During manual quality assurance.
  • Salted CX auto reviewers that review 100% of conversations.
  • Customer satisfaction surveys.

Each individual item in the Review data set contains an answer to a single question. So if you review an engagement with a form and answer multiple questions within that form, you will have multiple review items related to the same engagement.
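
As a small illustration of this fan-out, the hypothetical sketch below turns one form review of a single engagement into one review item per answered question. The record shape and field names are assumptions made for the example, not the actual Salted CX schema.

```python
# Hypothetical illustration: reviewing one engagement with a three-question form
# produces three review items, one per answered question.
engagement_id = "ENG-1001"

form_answers = {
    "Did the agent greet the customer?": "Yes",
    "Was the issue resolved?": "No",
    "Was the closing polite?": "Yes",
}

# Each answered question becomes its own item in the Review data set.
review_items = [
    {"Engagement": engagement_id, "Question": question, "Answer": answer}
    for question, answer in form_answers.items()
]

print(len(review_items))  # 3 review items, all related to the same engagement
```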

Data Set Properties

Property | Type | Description
Review | PID | Unique identifier of this individual review.
Engagement | Reference to Engagement | Engagement this review is related to.
Turn | Reference to Turn | Turn this review is related to.
Reviewer | Reference to Reviewer | Reviewer who provided this review. The reviewer can be either a person or a service.
Question | Reference to Question | Question which was reviewed.
Review Time | Date and Time | The time when the review was last updated.
Review Session | Entity | Groups reviews into a session so that multiple questions answered during the same session can be reported as one unit.
Answer | Entity | The answer the reviewer selected. The answer is identified by the text that was presented to them.
Sampling | Enumeration (Focused, Random, Selected, All) | Sampling method used for selecting which engagement or turn to review. This mainly allows you to distinguish random sampling from focused sampling.
Status | Enumeration (Completed, Deleted, Ignore, Pending, Rejected, Timeout) | The current state of the review.
Type | Enumeration (Agent, Auto, Customer, Reviewer) | Type of the review.
Used Form | Entity | The form that the reviewer had open when creating this review. Note that one question can be used in multiple forms; this attribute stores only the form open at the time the review was created, not other forms where the question is present.
Verified | Enumeration (Acknowledged, Correct, Disputed, Incorrect, Unclear) | Indicates whether the review is correct or incorrect, or whether it was acknowledged or disputed by agents. See the value descriptions below the table.
Answer Score | Fact | The score associated with the review in the original scoring system. The value is between the minimum and maximum score (both inclusive).
Confidence | Fact | The confidence with which the review answer was given. This is especially useful for auto review services.
Minimum Score | Fact | The minimum possible score for the question.
Maximum Score | Fact | The maximum possible score for the question.
Score | Fact | The score associated with the review, normalized to a percentage.

Verified values:

  • Acknowledged: The review is acknowledged by a person, typically an agent.
  • Correct: The review is marked as correct. Typically used to confirm an Auto review provided by an auto reviewer.
  • Disputed: The review is disputed by a person, typically an agent.
  • Incorrect: The review is marked as incorrect. Typically used to mark an Auto review provided by an auto reviewer as wrong.
  • Unclear: It is hard to tell whether the review is accurate. Typically used to tell auto reviewers not to treat the review as a good example for future auto reviews.
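
Returning to the score facts: the normalized Score is a simple rescaling of Answer Score between Minimum Score and Maximum Score. The sketch below shows one plausible way to compute it; the exact formula Salted CX uses is not stated here, so treat this as an assumption to verify against your own data.

```python
def normalized_score(answer_score: float, minimum_score: float, maximum_score: float) -> float:
    """Rescale a raw Answer Score to a percentage.

    Assumed formula: (Answer Score - Minimum Score) / (Maximum Score - Minimum Score) * 100.
    """
    if maximum_score == minimum_score:
        raise ValueError("Maximum Score must differ from Minimum Score")
    return (answer_score - minimum_score) / (maximum_score - minimum_score) * 100


# Example: a question scored on a 1-5 scale where the reviewer answered 4.
print(normalized_score(answer_score=4, minimum_score=1, maximum_score=5))  # 75.0
```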

Sampling

Sampling describes how a conversation, engagement, or turn was chosen for review.

Sampling | Description
Focused | All engagements matching a certain condition were selected for review. This is useful for reviews intended to gain a deeper understanding of a potential issue or opportunity. They are typically not a fair representation of an agent's performance.
Random | A random sample of engagements was selected for review. Reviews with random sampling are the best possible representation of an agent's performance.
Selected | The reviewer selected the conversation ad hoc. This may happen, for example, when a user working through a customer journey encounters something where feedback is necessary or helpful.
All | The review is offered to everybody.
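
Since only Random sampling gives a fair picture of agent performance, agent scorecards typically restrict reviews to that sampling method. A minimal sketch of that filter, assuming reviews are available as plain records (the field names are illustrative, not the actual export format):

```python
reviews = [
    {"Agent": "Alice", "Sampling": "Random", "Score": 80.0},
    {"Agent": "Alice", "Sampling": "Focused", "Score": 20.0},
    {"Agent": "Bob", "Sampling": "Random", "Score": 90.0},
]

# Keep only randomly sampled reviews when measuring agent performance;
# focused samples target suspected issues and would bias the average.
random_reviews = [r for r in reviews if r["Sampling"] == "Random"]
average_score = sum(r["Score"] for r in random_reviews) / len(random_reviews)
print(average_score)  # 85.0
```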

Status

The current status of the review. In most cases you want to filter this attribute to the Completed value. However, the other states can be used for more targeted use cases. See the table below for more details.

Status | Description
Completed | The review is completed and can be included in reports.
Deleted | The review was deleted and should not appear in results.
Ignore | The review is not relevant.
Pending | The review is scheduled but the reviewer has not answered yet. This may happen when you have planned reviews that reviewers should complete and you want to report on those that are still to be done.
Rejected | The reviewer declined to review the engagement. This may happen when the reviewer concludes that the engagement is not a good representative sample to review.
Timeout | The review was requested but the reviewer did not provide it. This may happen when reviewers did not have time to review or, for customer reviews, when customers simply chose not to answer.
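
As noted above, most reports should include only Completed reviews, while states such as Pending support more targeted views. A minimal sketch under the same illustrative assumptions as before:

```python
reviews = [
    {"Review": "R-1", "Status": "Completed", "Score": 75.0},
    {"Review": "R-2", "Status": "Pending", "Score": None},
    {"Review": "R-3", "Status": "Deleted", "Score": 50.0},
]

# Default reporting filter: only completed reviews carry usable results.
completed = [r for r in reviews if r["Status"] == "Completed"]

# A more targeted view: reviews that are still waiting on a reviewer.
pending = [r for r in reviews if r["Status"] == "Pending"]

print(len(completed), len(pending))  # 1 1
```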

Type

The Type attribute is used to filter reviews by their source.

Type | Description
Agent | Review provided by the agent on their own engagements. This type of review enables you to gather the agent's perspective.
Auto | The review was provided by an automated service.
Customer | The review was provided by the customer. This is typical for customer journeys.
Reviewer | The review was provided by a person responsible for quality assurance in the company. This person is different from the agent who handled the engagement.
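
When several sources feed the same data set, it can help to split results by review Type, for example to compare auto reviews with human quality assurance. A small sketch under the same illustrative assumptions:

```python
from collections import defaultdict

reviews = [
    {"Engagement": "ENG-1", "Type": "Auto", "Score": 60.0},
    {"Engagement": "ENG-1", "Type": "Reviewer", "Score": 70.0},
    {"Engagement": "ENG-2", "Type": "Customer", "Score": 40.0},
]

# Group scores by review type so each source can be reported separately.
scores_by_type = defaultdict(list)
for review in reviews:
    scores_by_type[review["Type"]].append(review["Score"])

for review_type, scores in sorted(scores_by_type.items()):
    print(review_type, sum(scores) / len(scores))
```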