
Add Data Validations


In this guide, you will learn how to add validations for your data. We will cover four places where you may decide to validate data: frontend input, backend input, ingest, and publishing.

Frontend Input Validations

In this section, we will explore how to implement field validations in the frontend. Field validations are essential to ensure data integrity and accuracy in your application. We will demonstrate how to use the Yup schema validation library in combination with React workflows to achieve this in the Mosaic Media Template.

Defining Yup Schema

To start with field validations, you need to define a Yup schema that represents the shape and rules for the data you want to validate. Here’s an example of how to define a Yup schema for a movie creation form in TypeScript:

import * as Yup from "yup";

const movieCreateSchema = Yup.object().shape<ObjectSchemaDefinition<FormData>>({
  title: Yup.string()
    .required("Title is a required field")
    .max(100, "Title must be at most 100 characters long"),
  // Add more fields and validation rules as needed
});

In the above code:

  • We import the Yup library.

  • Create a schema using Yup.object().shape({}), where you define the shape of your data object and its validation rules.
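To make the two rules on the title field concrete: outside of a form, they amount to the check in the plain-TypeScript sketch below. The validateTitle helper is hypothetical and only illustrates the behavior; in the template, Yup itself evaluates these rules against the form data.

```typescript
// Hypothetical helper mirroring the two Yup rules above:
// the title is required, and may be at most 100 characters long.
const validateTitle = (title: string | undefined): string[] => {
  const errors: string[] = [];
  if (!title || title.length === 0) {
    errors.push("Title is a required field");
  } else if (title.length > 100) {
    errors.push("Title must be at most 100 characters long");
  }
  return errors;
};
```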

Applying Yup Schema to a Station

Once you’ve defined the Yup schema, you can apply it to a Station where you want the schema validations to apply. Here’s an example of how to do this in the MovieCreate station:

<Create<FormData, SubmitResponse>
  title="New Movie"
  subtitle="Add new movie metadata"
  validationSchema={movieCreateSchema} // Assign the Yup schema to the validationSchema prop
>
  <Field name="title" label="Title" as={SingleLineTextField} />
</Create>

In the code above:

  • We assign the previously defined movieCreateSchema to the validationSchema prop of the station. This tells the form to validate the input data based on the specified schema.

With this setup, the form will automatically perform validation based on the Yup schema when a user interacts with it. If any validation rules are violated, error messages will be displayed to guide the user in providing valid data.

For further examples of how to write more complex validations (e.g. regex matches), please refer to the official Yup documentation.
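As an illustration of the kind of rule a regex match expresses (in Yup this would be written with .matches()), the sketch below shows the equivalent plain-TypeScript check for a hypothetical four-digit releaseYear field. The field name is an example, not part of the template.

```typescript
// Illustrative only: a four-digit-year rule, roughly what
// Yup.string().matches(/^\d{4}$/, "...") would enforce on a form field.
const isValidReleaseYear = (value: string): boolean => /^\d{4}$/.test(value);
```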

Backend Input Validations

In this section, we will explore how to implement field validations in the backend for GraphQL APIs based on PostGraphile.

We will assume that you want to perform input validations for the createMovie and updateMovie mutations.

Create a PostGraphile Wrap Resolver Plugin

A wrap resolver plugin allows you to write code that wraps the existing GraphQL resolver of a field, enabling you to perform validation before or after the original resolver’s execution.

Begin by creating a new .ts file under the graphql/plugins folder or any suitable location:

import { makeWrapResolversPlugin } from "graphile-utils";

const validateMovieData = (propName: string) => {
  return async (resolve, source, args, context, resolveInfo) => {
    const movie = args.input[propName];

    // You may perform any required validation based on the input data here
    // and throw an exception to stop the flow and propagate the error to the user.
    await isValidMovieData(movie);

    return resolve(); // This will invoke the 'original' implementation of the mutation
  };
};

export const MovieCreateUpdateValidationPlugin = makeWrapResolversPlugin({
  Mutation: {
    createMovie: validateMovieData("movie"),
    updateMovie: validateMovieData("patch"),
  },
});

We pass in propName because the GQL API nests the input of the create and update mutations differently, as shown below. Passing the correct sub-structure into the validation method covers both the create and update use cases.

mutation MovieMutations {
  createMovie(input: { movie: { title: "Some Title" } })
  updateMovie(input: { patch: { title: "Some Title to Update" }, id: 1 })
}
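The isValidMovieData helper awaited in the plugin is not shown above. A minimal sketch, assuming validation failures are signalled by throwing, might look like the following; the specific rules are illustrative, not part of the template.

```typescript
// Hypothetical implementation of the helper awaited in the wrap resolver.
// Throwing here aborts the mutation and propagates the message to the caller.
const isValidMovieData = async (movie: { title?: string }): Promise<void> => {
  if (movie.title !== undefined && movie.title.trim().length === 0) {
    throw new Error("Title must not be empty or whitespace-only.");
  }
  if (movie.title !== undefined && movie.title.length > 100) {
    throw new Error("Title must be at most 100 characters long.");
  }
};
```

Because updateMovie passes a partial patch, the sketch only validates fields that are actually present in the input.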

Loading the Validation Plugin into PostGraphile Options

Now we need to load the Plugin into the PostGraphile Options builder so that our wrapper will be applied onto the standard PostGraphile API.

To do this, locate the postgraphile-options.ts file for your GQL service and adjust it as below:

import { MovieCreateUpdateValidationPlugin } from "../domains/movies/plugins/movie-create-update-validation-plugin";

export function buildPostgraphileOptions(
  config: Config
): PostGraphileOptions<Request, Response> {
  return new PostgraphileOptionsBuilder()
    .setDefaultSettings(config.isDev, config.graphqlGuiEnabled)
    .setErrorsHandler((errors, req) => {
      return enhanceGraphqlErrors(errors /* ... */);
    })
    .setHeader("Access-Control-Max-Age", 86400)
    .setPgSettings(async () => ({ role: config.dbGqlRole }))
    .addPlugins(
      MovieCreateUpdateValidationPlugin, // Add your imported plugin to the existing list of plugins
      AddErrorCodesEnumPluginFactory([MosaicErrors, CommonErrors])
    )
    .addGraphileBuildOptions({ pgSkipInstallingWatchFixtures: true })
    .build();
}

Now you may test out the Backend validation to make sure everything is working as expected.

Ingest Validations

Up to this point, we’ve focused on "Data Entry" validations directly within the frontend and backend. However, it’s also possible to implement validations during data ingestion if your service supports it.

During ingest, an input document (e.g. a JSON file) is loaded and the corresponding data is inserted directly into the appropriate tables. By default it does not go through the GQL API, for performance reasons. However, similar to the input validations on the API, we can also perform validations during an ingest.

In the standard Mosaic Media Template, extending the ingest validations is quite straightforward. All custom validations are found in the file ingest-validation.ts (for example: services/media/service/src/ingest/utils/ingest-validation.ts).

Now you may tweak the customIngestValidation function as required:

export const customIngestValidation = (
  document: IngestDocument
): IValidationError[] => {
  // Inspect the input document and return validation errors
  // to propagate them to the user as ingest errors.
  return [];
};
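As an illustration, a customIngestValidation that rejects documents with no items might look like the sketch below. The simplified IngestDocument and IValidationError shapes are assumptions made to keep the example self-contained; use the actual types from the template in real code.

```typescript
// Simplified stand-ins for the template's types, for illustration only.
interface IngestItem { type: string; external_id: string; }
interface IngestDocument { name: string; items: IngestItem[]; }
interface IValidationError { message: string; }

const customIngestValidation = (
  document: IngestDocument
): IValidationError[] => {
  const errors: IValidationError[] = [];
  if (document.items.length === 0) {
    errors.push({ message: "The ingest document contains no items." });
  }
  for (const item of document.items) {
    if (!item.external_id) {
      errors.push({
        message: `Item of type '${item.type}' is missing an external_id.`,
      });
    }
  }
  return errors;
};
```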

Publish Validations

This is a special case when your service also participates in data publishing logic, such as the Media Service. You may want to perform specific validations to ensure the integrity of the data after ingestion or manual editing, before it is pushed into a different system or service.

In the Mosaic Media Template, it is quite straightforward to perform publish validations for the services that support it. All custom validations are found in the file publishing-{entity-name}-processor.ts (for example: services/media/service/src/domains/movies/handlers/publishing-movie-processor.ts).

If you are interested in Movie publishing, adjust the customMovieValidation function in the aforementioned file. This function is passed into the validator property of the EntityPublishingProcessor type and is automatically called during the publishing process. Any validation errors will be presented to the user.

const customMovieValidation = async (
  json: unknown
): Promise<SnapshotValidationResult[]> => {
  const yupSchema = Yup.object({
    genre_ids: atLeastOneString,
    images: requiredCover,
    videos: videosValidation("MAIN", "TRAILER"),
    licenses: licensesValidation(true),
  });
  return validateYupPublishSchema(json, yupSchema);
};

export const publishingMovieProcessor: EntityPublishingProcessor = {
  type: "movies",
  aggregator: movieDataAggregator,
  validator: customMovieValidation,
  validationSchema: MoviePublishedEventSchema,
  publishMessagingSettings: PublishServiceMessagingSettings.MoviePublished,
  unpublishMessagingSettings: PublishServiceMessagingSettings.MovieUnpublished,
};

In the code above, we use Yup schema validation, along with helper functions like atLeastOneString, requiredCover, etc., to perform the necessary validation. Any errors will be communicated to the user during the publishing process.
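Helpers such as atLeastOneString encapsulate reusable rules; in Yup, such a rule would be written roughly as Yup.array().of(Yup.string()).min(1). As a plain-TypeScript illustration of what that rule enforces, a hypothetical equivalent check could look like this:

```typescript
// Hypothetical stand-in for the atLeastOneString rule: the value must be an
// array containing at least one element, all of which are non-empty strings.
const atLeastOneStringCheck = (value: unknown): boolean =>
  Array.isArray(value) &&
  value.length > 0 &&
  value.every((v) => typeof v === "string" && v.length > 0);
```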

By following these guidelines, you can implement a robust system of data validations across your application, ensuring data accuracy, integrity, and a smooth user experience.