Engine
Generating a Tartiflette engine is pretty simple: most of the time, you will use the create_engine function exposed in the tartiflette package. This function performs all the tasks necessary to build your engine.
create_engine prepares and cooks your engine
create_engine is the easiest and quickest way to instantiate and build a Tartiflette engine. Behind the scenes, this factory implements the regular cooking process.
Using the SDL (Schema Definition Language) parameter with different types
When the sdl parameter contains the SDL as a raw string
from tartiflette import create_engine
engine = await create_engine(
    """
    type Query {
      hello: String
    }
    """
)
When the sdl parameter contains a path to a file
The specified file has to contain the full Schema Definition Language.
from tartiflette import create_engine
engine = await create_engine(
    "/User/chuck/workspace/mytartiflette/schema.graphql"
)
When the sdl parameter contains a list of file paths
Every file will be concatenated in the order of the provided list.
from tartiflette import create_engine
engine = await create_engine(
    [
        "/User/chuck/workspace/mytartiflette/schema_query.graphql",
        "/User/chuck/workspace/mytartiflette/schema_mutation.graphql",
    ]
)
When the sdl parameter contains a path to a directory
Every file which ends with .graphql (or .sdl) will be concatenated in lexicographical order.
from tartiflette import create_engine
engine = await create_engine("/User/chuck/workspace/mytartiflette")
Advanced constructor
The create_engine function provides an advanced interface for initialization. It accepts multiple parameters:
- sdl (Union[str, List[str]]): raw SDL, path or list of file paths/directories from which to retrieve the SDL (more detail above)
- schema_name (str = "default"): name of the schema represented by the provided SDL (more detail here)
- error_coercer (Callable[[Exception, Dict[str, Any]], Dict[str, Any]]): callable used to coerce an exception into a valid GraphQL output format (more detail here)
- custom_default_resolver (Optional[Callable]): callable used to resolve fields which don't implement a dedicated resolver (useful if you want to override the behavior for resolving a field, e.g. from snake_case to camelCase and vice versa) (more detail here)
- custom_default_type_resolver (Optional[Callable]): callable that will replace the Tartiflette default_type_resolver (called on abstract types to deduce the type of a result) (more detail here)
- modules (Optional[Union[str, List[str], List[Dict[str, Any]]]]): list of strings containing the names of the modules you want the engine to import; usually these modules contain your @Resolver, @Directive, @Scalar or @Subscription code (more detail here)
- query_cache_decorator (Optional[Callable]): callable that will replace the Tartiflette default lru_cache decorator used to cache query parsing
- json_loader (Optional[Callable[[str], Dict[str, Any]]]): callable that will replace the Python built-in json.loads when Tartiflette transforms the JSON AST of the query into a dict usable by the execution algorithm (more detail here)
- custom_default_arguments_coercer (Optional[Callable]): callable that will replace the Tartiflette default_arguments_coercer
- coerce_list_concurrently (Optional[bool]): determines whether or not output lists are coerced concurrently by default

A combined example follows this list.
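As an illustration, here is a minimal sketch combining several of these parameters (the SDL path and module name below are hypothetical):

import os

from tartiflette import create_engine

engine = await create_engine(
    os.path.dirname(os.path.abspath(__file__)) + "/sdl",
    schema_name="default",
    modules=["recipes_manager.query_resolvers"],  # hypothetical module
    coerce_list_concurrently=False,  # coerce output lists sequentially
)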
Parameter: error_coercer
The main objective of the error_coercer is to provide you with a way to extend the behavior when an exception is raised within Tartiflette.
For instance:
- add a log entry when a third-party exception is raised (e.g. pymysql, redis)
- hide an exception's technical message in a production environment (don't expose your internal stack to the outside world)
error_coercer SHOULDN'T be used for custom functional exceptions; for this common use-case, please take a look at TartifletteError and its documentation page (a short sketch follows the example below).
import logging
from typing import Any, Dict

from tartiflette import create_engine

logger = logging.getLogger(__name__)


class CustomException(Exception):
    def __init__(self, type_name: str, message: str) -> None:
        self.type = type_name
        self.message = message


async def my_error_coercer(
    exception: Exception, error: Dict[str, Any]
) -> Dict[str, Any]:
    if isinstance(exception, CustomException):
        logger.error("Unable to reach the Storage host.")
        error["extensions"]["type"] = exception.type
    return error


engine = await create_engine(
    "my_sdl.graphql",
    error_coercer=my_error_coercer,
)
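By contrast, a custom functional error can simply be raised as a TartifletteError (or a subclass of it) from within a resolver; a minimal sketch, assuming a Query.recipe field exists in your schema:

from tartiflette import Resolver, TartifletteError


@Resolver("Query.recipe")
async def resolve_recipe(parent, args, ctx, info):
    # Functional error: surfaced to the client as a regular GraphQL error.
    raise TartifletteError("Recipe not found.")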
Parameter: custom_default_resolver
The custom_default_resolver parameter provides an easy way to override the default resolver used internally by Tartiflette during execution. The default resolver is the resolver used for each field which doesn't implement a dedicated resolver (meaning a field without a callable decorated with @Resolver). Overriding it can be useful, for instance, to resolve fields from snake_case to camelCase and vice versa.
from tartiflette import create_engine


async def my_default_resolver(parent, arguments, context, info):
    # Compute and return the value of the field.
    a_value = 42
    return a_value


engine = await create_engine(
    "my_sdl.graphql",
    custom_default_resolver=my_default_resolver,
)
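For the snake_case/camelCase use-case mentioned above, a possible sketch could look like the following; note that info.field_name and dict-shaped parents are assumptions to validate against your Tartiflette version:

from tartiflette import create_engine


def to_snake_case(name: str) -> str:
    # "myField" -> "my_field"
    return "".join("_" + c.lower() if c.isupper() else c for c in name)


async def snake_case_default_resolver(parent, arguments, context, info):
    # ASSUMPTION: info.field_name exposes the GraphQL field name and
    # parent is a dict produced by the parent resolver.
    if parent is None:
        return None
    return parent.get(to_snake_case(info.field_name))


engine = await create_engine(
    "my_sdl.graphql",
    custom_default_resolver=snake_case_default_resolver,
)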
Parameter: custom_default_type_resolver
The custom_default_type_resolver parameter provides an easy way to override the default type resolver used internally by Tartiflette during execution. The default type resolver is used for each abstract field which doesn't implement a dedicated type resolver. Overriding it can be useful to change how the __typename of a field is resolved.
from tartiflette import create_engine


async def my_default_type_resolver(result, ctx, info, abstract_type):
    # Deduce the concrete type name from the resolved result.
    return result["__typename"]


engine = await create_engine(
    "my_sdl.graphql",
    custom_default_type_resolver=my_default_type_resolver,
)
Parameter: modules
Prior to creating the Tartiflette engine, your code must be decorated with the following decorators to be taken into account:
- @Resolver
- @TypeResolver
- @Subscription
- @Scalar
- @Directive
Importing them all by yourself can be verbose and generate a lot of imports.
Both for your internal code and for plugin management, Tartiflette provides a modules parameter which gives you the ability to specify all the internal and external modules you want to import. In addition to the module, you can specify a configuration, mostly used by the Tartiflette plugin approach.
This allows you to keep your code cleaner by doing this:
import os

from tartiflette import create_engine

engine = await create_engine(
    os.path.dirname(os.path.abspath(__file__)) + "/sdl",
    modules=[
        "recipes_manager.query_resolvers",
        "recipes_manager.mutation_resolvers",
        "recipes_manager.subscription_resolvers",
        "recipes_manager.directives.auth",
        "recipes_manager.directives.rate_limiting",
    ],
)
instead of:
import os

from tartiflette import create_engine

import recipes_manager.query_resolvers
import recipes_manager.mutation_resolvers
import recipes_manager.subscription_resolvers
import recipes_manager.directives.auth
import recipes_manager.directives.rate_limiting

engine = await create_engine(
    os.path.dirname(os.path.abspath(__file__)) + "/sdl"
)
Giving configuration to a module
As explained above, the modules parameter can be used to provide a list of modules to import, but sometimes you will need to provide some configuration to a module. To do so, instead of providing a simple string targeting the module, fill in a dictionary with a name key targeting the module and a config key containing the configuration needed by the module:
import os

from tartiflette import create_engine

engine = await create_engine(
    os.path.dirname(os.path.abspath(__file__)) + "/sdl",
    modules=[
        "recipes_manager.query_resolvers",
        "recipes_manager.mutation_resolvers",
        {"name": "a.module.that.needs.config", "config": {"key": "value"}},
        {"name": "b.module.that.needs.config", "config": {"key": "value"}},
    ],
)
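On the module side, a module declared with a config entry is expected to consume that configuration through the Tartiflette plugin approach; here is a hedged sketch of what such a module could look like (the module path, the bake hook and the resolved field are assumptions to check against the plugin documentation):

# a/module/that/needs/config.py (hypothetical plugin module)
from tartiflette import Resolver


async def bake(schema_name: str, config: dict) -> None:
    # ASSUMPTION: Tartiflette calls this coroutine with the `config` dict
    # declared in the `modules` parameter.
    @Resolver("Query.configuredField", schema_name=schema_name)
    async def resolve_configured_field(parent, args, ctx, info):
        return config["key"]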
Parameter: query_cache_decorator
The query_cache_decorator parameter provides an easy way to override the default cache decorator used internally by Tartiflette for query parsing.
The default cache decorator uses the functools.lru_cache function with a maxsize of 512.
If necessary, you can change this behavior by providing your own decorator to cache query parsing, or disable the cache by passing None to this parameter (an example of disabling the cache follows the one below).
Here is an example of a custom decorator using lru_cache with a maxsize of 1024 instead of the default 512:
from functools import lru_cache
from typing import Callable

from tartiflette import create_engine


def my_cache_decorator(func: Callable) -> Callable:
    return lru_cache(maxsize=1024)(func)


engine = await create_engine(
    "my_sdl.graphql",
    query_cache_decorator=my_cache_decorator,
)
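And here is how disabling the query parsing cache could look, since None is an accepted value:

from tartiflette import create_engine

engine = await create_engine(
    "my_sdl.graphql",
    query_cache_decorator=None,  # disable the query parsing cache
)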
Parameter: json_loader
This parameter enables you to use another JSON library for loading the AST-JSON of the query (happens around here).
An example usage could be to change the JSON library:
import os

import rapidjson

from tartiflette import create_engine

engine = await create_engine(
    os.path.dirname(os.path.abspath(__file__)) + "/sdl",
    modules=[
        "recipes_manager.query_resolvers",
        "recipes_manager.mutation_resolvers",
        {"name": "a.module.that.needs.config", "config": {"key": "value"}},
        {"name": "b.module.that.needs.config", "config": {"key": "value"}},
    ],
    json_loader=rapidjson.loads,
)
or to give more arguments to the built-in Python loader:
import json
import os
from functools import partial

from tartiflette import create_engine

engine = await create_engine(
    os.path.dirname(os.path.abspath(__file__)) + "/sdl",
    modules=[
        "recipes_manager.query_resolvers",
        "recipes_manager.mutation_resolvers",
        {"name": "a.module.that.needs.config", "config": {"key": "value"}},
        {"name": "b.module.that.needs.config", "config": {"key": "value"}},
    ],
    json_loader=partial(json.loads, parse_float=..., object_hook=...),
)
Parameter: custom_default_arguments_coercer
The custom_default_arguments_coercer parameter provides an easy way to override the default callable used internally by Tartiflette to coerce arguments. The default arguments coercer uses the asyncio.gather function to coerce arguments concurrently. It can be useful to override this behavior; for instance, you could use the sync_arguments_coercer in order to coerce your arguments synchronously and avoid the creation of too many asyncio tasks.
from typing import Any, List, Union

from tartiflette import create_engine


async def my_default_arguments_coercer(*coroutines) -> List[Union[Exception, Any]]:
    # Await each coroutine one after the other instead of gathering them.
    results = []
    for coroutine in coroutines:
        try:
            result = await coroutine
        except Exception as e:  # pylint: disable=broad-except
            result = e
        results.append(result)
    return results


engine = await create_engine(
    "my_sdl.graphql",
    custom_default_arguments_coercer=my_default_arguments_coercer,
)
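If the sync_arguments_coercer mentioned above fits your need, you could pass it directly instead of writing your own; a minimal sketch, assuming it is importable from the top-level tartiflette package (double-check the import path for your version):

from tartiflette import create_engine, sync_arguments_coercer  # import path assumed

engine = await create_engine(
    "my_sdl.graphql",
    custom_default_arguments_coercer=sync_arguments_coercer,
)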
Advanced instantiation
For those who want to integrate Tartiflette into advanced use-cases, you could be interested in owning the process of building an Engine.
Why own the cooking (building) process of Tartiflette?
The cooking process of Tartiflette is the equivalent of the build process of another library: it prepares your engine to be executed. Needless to say, an engine instance cannot be executed without being cooked. Like the meal, you cannot eat a tartiflette without cooking it first. That's it.
Customizing the cooking process is interesting for integrating Tartiflette into another library, like aiohttp, starlette, django and so on.
Note:
tartiflette-aiohttp has its own flow to manage the cooking process of the engine.
cook() your Tartiflette
As specified above, the Engine needs to be cook()ed before being executed. Here is the sequence to execute a query on an Engine instance.
from tartiflette import Engine

# 1. Create an instance of the Engine
engine = Engine()

# 2. Cook (build) the engine to prepare it for execution
await engine.cook(
    """
    type Query {
      hello: String
    }
    """
)

# 3. Execute a GraphQL query
await engine.execute("query { hello }")
cook() interface
The cook method is asynchronous; this strong choice allows us to execute asynchronous tasks during the building process, like:
- fetching the SDL from another API
- fetching third-party services (database structure, cloud provider objects ...)
- fetching the schema from a schema manager
async def cook(
    self,
    sdl: Union[str, List[str]] = None,
    error_coercer: Callable[[Exception, Dict[str, Any]], Dict[str, Any]] = None,
    custom_default_resolver: Optional[Callable] = None,
    modules: Optional[Union[str, List[str], List[Dict[str, Any]]]] = None,
    query_cache_decorator: Optional[Callable] = UNDEFINED_VALUE,
    json_loader: Optional[Callable[[str], Dict[str, Any]]] = None,
    custom_default_arguments_coercer: Optional[Callable] = None,
    coerce_list_concurrently: Optional[bool] = None,
    schema_name: str = None,
) -> None:
    pass
- sdl (Union[str, List[str]]): raw SDL, path or list of file paths/directories from which to retrieve the SDL (more detail above)
- error_coercer (Callable[[Exception, Dict[str, Any]], Dict[str, Any]]): callable used to coerce an exception into a valid GraphQL output format (more detail here)
- custom_default_resolver (Optional[Callable]): callable used to resolve fields which don't implement a dedicated resolver (useful if you want to override the behavior for resolving a field, e.g. from snake_case to camelCase and vice versa) (more detail here)
- custom_default_type_resolver (Optional[Callable]): callable that will replace the Tartiflette default_type_resolver (called on abstract types to deduce the type of a result) (more detail here)
- modules (Optional[Union[str, List[str], List[Dict[str, Any]]]]): list of strings containing the names of the modules you want the engine to import; usually these modules contain your @Resolver, @Directive, @Scalar or @Subscription code (more detail here)
- query_cache_decorator (Optional[Callable]): callable that will replace the Tartiflette default lru_cache decorator used to cache query parsing
- json_loader (Optional[Callable[[str], Dict[str, Any]]]): callable that will replace the Python built-in json.loads when Tartiflette transforms the JSON AST of the query into a dict usable by the execution algorithm (more detail here)
- custom_default_arguments_coercer (Optional[Callable]): callable that will replace the Tartiflette default_arguments_coercer
- coerce_list_concurrently (Optional[bool]): determines whether or not output lists are coerced concurrently by default
- schema_name (str = "default"): name of the schema represented by the provided SDL (more detail here)

A sketch taking advantage of this asynchronous interface follows.
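As an illustration of the asynchronous cook() described above, here is a hedged sketch fetching the SDL from a remote endpoint before cooking; the URL is hypothetical and aiohttp is only one possible HTTP client:

import aiohttp

from tartiflette import Engine


async def build_engine_from_remote_sdl(url: str) -> Engine:
    # Fetch the SDL over HTTP, then cook the engine with it.
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            sdl = await response.text()

    engine = Engine()
    await engine.cook(sdl)
    return engine


# Hypothetical usage:
# engine = await build_engine_from_remote_sdl("https://example.com/schema.graphql")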