---
sidebar_position: 2
sidebar_label: Add custom logic
description:
  "Learn how to add simple custom business logic to your API using lambda connectors and build on this example with
  your own custom logic."
keywords:
  - hasura
  - hasura ddn
  - cicd
  - api deployment
  - business logic
---

import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";

# Add Custom Logic

## Introduction

In this tutorial, you'll use a lambda connector to add custom business logic to your supergraph API. The connector and DDN CLI will automatically determine the argument and return types from your custom logic and add them to your API's GraphQL schema.

Each connector uses its own conventions to determine whether a function should be exposed as a query or mutation; in this tutorial, you'll add both.

This tutorial should take about five minutes.

## Step 1. Initialize a new local DDN project

```sh
ddn supergraph init lambda-tutorial
```

## Step 2. Initialize the lambda connector

<Tabs groupId="language">
<TabItem value="TypeScript" label="TypeScript">

```sh
ddn connector init my_ts -i
```

- Select `hasura/nodejs` from the list of connectors.
- Choose a port (press enter to accept the default recommended by the CLI).

If you open the `app/connector/my_ts` directory, you'll see the `functions.ts` file generated by the CLI; this will be the entrypoint for your connector.

</TabItem>

<TabItem value="Python" label="Python">

```sh
ddn connector init my_python -i
```

- Select `hasura/python` from the list of connectors.
- Choose a port (press enter to accept the default recommended by the CLI).

If you open the `app/connector/my_python` directory, you'll see the `functions.py` file generated by the CLI; this will be the entrypoint for your connector.

</TabItem>

<TabItem value="Go" label="Go">

```sh
ddn connector init my_go -i
```

- Select `hasura/go` from the list of connectors.
- Choose a port (press enter to accept the default recommended by the CLI).

If you open the `app/connector/my_go` directory, you'll see Go files in the `functions` folder; these will serve as the entrypoint for your connector.

</TabItem>
</Tabs>

## Step 3. Add custom logic

<Tabs groupId="language">
<TabItem value="TypeScript" label="TypeScript">

Install your connector's dependencies:

```sh
cd app/connector/my_ts && npm install
```

Then, update `functions.ts` to include the following:

```typescript
/**
 * @readonly Exposes the function as an NDC function (the function should only query data without making modifications)
 */
export function hello(name?: string) {
  return `hello ${name ?? "world"}`;
}

/**
 * As this is missing the readonly tag, this will expose the function as an NDC procedure (the function will be exposed as a mutation in the API)
 */
export function encode(username: string) {
  return Buffer.from(username).toString("base64");
}
```

Including the `@readonly` tag ensures `hello()` is exposed as a query in our API. By omitting it from `encode()`, we're ensuring this second function is exposed as a mutation when we build our API.

Both have typed input arguments and implicitly return strings, which the connector will use to generate the corresponding GraphQL schema.
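
You're not limited to scalars, either. As a purely illustrative sketch (the `Greeting` interface and `helloDetailed` function below are hypothetical, not part of this tutorial), a typed object return value would be mapped to an object type in your schema:

```typescript
// Hypothetical example: an interface return type becomes an object type.
interface Greeting {
  message: string;
  length: number;
}

/**
 * @readonly Exposed as a query, with Greeting surfaced as an object type
 * in the generated schema.
 */
export function helloDetailed(name?: string): Greeting {
  const message = `hello ${name ?? "world"}`;
  return { message, length: message.length };
}
```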

</TabItem>
<TabItem value="Python" label="Python">

Update `functions.py` to include the following:

```python
from hasura_ndc import start
from hasura_ndc.function_connector import FunctionConnector
from pydantic import BaseModel, Field
from hasura_ndc.errors import UnprocessableContent
from typing import Annotated
import base64


connector = FunctionConnector()

@connector.register_query
def hello(name: str) -> str:
    return f"Hello {name}"

@connector.register_mutation
def encode(username: str) -> str:
    return base64.b64encode(username.encode("utf-8")).decode("utf-8")


if __name__ == "__main__":
    start(connector)
```

Using the `@connector.register_query` decorator ensures `hello()` is exposed as a query in our API. By using the `@connector.register_mutation` decorator with `encode()`, we're ensuring this second function is exposed as a mutation when we build our API.

Both have typed input arguments and return strings, which the connector will use to generate the corresponding GraphQL schema.
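
The same applies to richer types, which the Pydantic imports above hint at. As a hypothetical sketch (the `Greeting` model and `hello_detailed` function are illustrative, not part of this tutorial), a Pydantic model return type would surface as an object type in your schema:

```python
# Hypothetical example: a Pydantic model return type becomes an object type.
class Greeting(BaseModel):
    message: str
    length: int

@connector.register_query
def hello_detailed(name: str) -> Greeting:
    message = f"Hello {name}"
    return Greeting(message=message, length=len(message))
```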

</TabItem>
<TabItem value="Go" label="Go">

In the `app/connector/my_go/functions` directory, add the following:

```go
package functions

import (
	"context"
	"encoding/base64"

	"hasura-ndc.dev/ndc-go/types"
)

// EncodeUsernameArguments represents the input arguments for encoding a username.
type EncodeUsernameArguments struct {
	Username string `json:"username"`
}

// EncodeUsernameResult represents the result of the Base64 encoding.
type EncodeUsernameResult struct {
	EncodedUsername string `json:"encodedUsername"`
}

// ProcedureEncode encodes the given username into Base64 format.
func ProcedureEncode(ctx context.Context, state *types.State, arguments *EncodeUsernameArguments) (*EncodeUsernameResult, error) {
	encoded := base64.StdEncoding.EncodeToString([]byte(arguments.Username))
	return &EncodeUsernameResult{
		EncodedUsername: encoded,
	}, nil
}
```

Using the `Procedure` prefix ensures `ProcedureEncode()` is exposed as a mutation in our API. If you open the `hello.go` file in the `functions` directory, you'll notice that `FunctionHello()` is prefixed differently, identifying it as a function to be exposed as a query in your API.

Both have typed input arguments and typed results, which the connector will use to generate the corresponding GraphQL schema.
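
For comparison, query handlers use the `Function` prefix. The sketch below is illustrative rather than the exact boilerplate the CLI generates, but it shows the shape a `hello` query takes (and why the tip in Step 5 mentions a `greeting` argument and field selection):

```go
// HelloArguments represents the input arguments for the hello query.
type HelloArguments struct {
	Greeting string `json:"greeting"`
}

// HelloResult represents the reply returned by the hello query.
type HelloResult struct {
	Reply string `json:"reply"`
}

// FunctionHello returns a reply to the given greeting; the Function
// prefix exposes it as a query in the API.
func FunctionHello(ctx context.Context, state *types.State, arguments *HelloArguments) (*HelloResult, error) {
	return &HelloResult{Reply: "Hello, " + arguments.Greeting}, nil
}
```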

</TabItem>
</Tabs>

## Step 4. Introspect the source file(s)

<Tabs groupId="language">
<TabItem value="TypeScript" label="TypeScript">

```sh
ddn connector introspect my_ts

# alternatively, use ddn command add my_ts "*" for bulk adds
ddn command add my_ts hello
ddn command add my_ts encode
```

</TabItem>
<TabItem value="Python" label="Python">

```sh
ddn connector introspect my_python

# alternatively, use ddn command add my_python "*" for bulk adds
ddn command add my_python hello
ddn command add my_python encode
```

</TabItem>
<TabItem value="Go" label="Go">

```sh
ddn connector introspect my_go

# alternatively, use ddn command add my_go "*" for bulk adds
ddn command add my_go hello
ddn command add my_go encode
```

</TabItem>
</Tabs>

The commands introspected your connector's entrypoint, identified functions with their argument and return types, and generated Hasura metadata for each. Look for `Hello.hml` and `Encode.hml` to see the CLI-generated metadata.
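
This metadata is what drives your GraphQL schema. For the TypeScript and Python connectors above, the resulting schema will look roughly like the following (a sketch; exact nullability depends on your argument and return types, and the Go connector's object results differ):

```graphql
type Query {
  hello(name: String): String
}

type Mutation {
  encode(username: String!): String
}
```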

## Step 5. Create a new build and test

```sh
# create a new local build of your supergraph
ddn supergraph build local

# start your local services
ddn run docker-start

# open your project's console
ddn console --local
```

In the console, run the following query:

```graphql
# Which will return "hello Hasura" ("Hello Hasura" with the Python connector)
query HelloQuery {
  hello(name: "Hasura")
}
```

:::tip Using the Go connector?

The boilerplate `hello()` function requires a `greeting` argument and a selection of the return type's fields in your query.

:::

Then, run the following mutation:

```graphql
# Which will return "aGFzdXJh"
mutation EncodeMutation {
  encode(username: "hasura")
}
```
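
If you chose the Go connector, `encode` returns an object rather than a plain string, so you'll need to select its field (the name comes from the struct's JSON tag in Step 3):

```graphql
mutation EncodeMutation {
  encode(username: "hasura") {
    encodedUsername
  }
}
```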

## Next steps

Now that you've seen how easy it is to add custom business logic directly to your supergraph API, consider these next steps: