How do I configure a simple connection that makes a POST request to an API endpoint?
To make a POST request, a user needs to define the URL of the endpoint/resource, the method (POST), and a request body (data). To translate this into DSYNC terms, we need to create a destination API endpoint where we configure all three components of the request, as well as an arbitrary source endpoint which will trigger the request itself.
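Outside DSYNC, these same three components map directly onto a raw HTTP request. A minimal sketch in Python, using a hypothetical placeholder URL (the request is only assembled here, not sent):

```python
import json
import urllib.request

# The three components of a POST request: URL, method, and body.
# 'https://api.example.com/items' is a hypothetical placeholder endpoint.
url = "https://api.example.com/items"
body = json.dumps({"foo": "bar"}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=body,                                    # request body
    method="POST",                                # HTTP method
    headers={"Content-Type": "application/json"}, # body format
)
# urllib.request.urlopen(req) would actually send the request.
```

In DSYNC, the URL, method, and headers belong to the destination API endpoint's settings, while the body is supplied by the source data and the mapping.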
1) Create a container system for destination endpoint
API endpoints are not standalone endpoints in DSYNC and require a container system. You can add a destination API endpoint into an existing system created from the +SYSTEM wizard, as well as into any manually created system. For the purposes of this exercise, we will create a new system manually.
In the top right corner, select '+SYSTEM' and then 'Create system manually'.
Click on the canvas where you want to place the new system. Enter the name and confirm. Your new system is placed on the canvas.
2) Add destination API endpoint
Select the system you created in the previous step by clicking on its name. Notice that the right sidebar now shows system-related actions. Select 'Add endpoint'.
Name your endpoint, select 'DESTINATION' type and 'API' connector, and confirm.
3) Configure API endpoint
Select the endpoint on the canvas by clicking on its name - the right sidebar now shows the corresponding endpoint settings.
The URL, method, and headers can be configured directly from the right sidebar. Enter details as required and save your settings.
In DSYNC terms, the body of a request is a data layout. Each endpoint that works with data must have its data layout defined. More information about data layouts can be found here.
In this exercise we will be sending a single field 'foo' into the destination API, so we need to add this field to our data layout.
Double-click the endpoint to get to the data layout page. Select JSON type, then add a new field and name it 'foo'. Save your changes when done.
Your data layout should look like the one above, showing the JSON file format selected and a single field of text data type called 'foo'.
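Conceptually, this layout describes JSON documents containing a single text field called 'foo'. A minimal sketch of what it means for a document to match that layout (the field name comes from this exercise; the check itself is illustrative, not DSYNC's internal validation):

```python
import json

# The data layout from this exercise: JSON format, one text field 'foo'.
LAYOUT_FIELDS = {"foo": str}

def matches_layout(raw_json: str) -> bool:
    """Return True if the document has exactly the layout's fields with the right types."""
    record = json.loads(raw_json)
    return (
        set(record) == set(LAYOUT_FIELDS)
        and all(isinstance(record[name], ftype) for name, ftype in LAYOUT_FIELDS.items())
    )
```

For example, `matches_layout('{"foo": "hello"}')` passes, while a document with a missing field or a non-text value does not.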
4) Add source endpoint
To actually run/test the POST request, we need to create a job, and for that we will need a source endpoint.
Add a manual Upload endpoint onto the canvas via the '+ENDPOINT' button in the top right, then select 'Upload'.
Double-click the endpoint and configure its data layout - select JSON data type again and add one field called 'foo'. The data layout should look like the one below.
Save your settings and return to the main canvas.
5) Create job
Now you have a source and a destination endpoint available on your canvas. Both endpoints have their data layouts and settings configured as required, so the next step is to link them to create a job which can be run.
Click on the source endpoint to select it. In the right sidebar, click the '+CREATE LINK' button. Finally, click on the destination endpoint.
A pop-up will appear listing all available/saved mapping templates. Scroll all the way down and click the 'NEW' button, since we want to create a new mapping for this job based on our data layouts. This creates a brand new mapping template and opens the mapping page.
In a mapping template you can define the output for each individual field in your destination data layout. You can map one or more fields from your source data onto a destination field. You can also enter static (hard-coded) values instead of source data, and apply different functions to transform the data before it is sent to the destination.
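The three mapping ideas above (source-field mapping, static values, and transform functions) can be sketched in Python. The template structure, field names, and transform here are illustrative, not DSYNC's internal format:

```python
# Illustrative mapping template: each destination field is produced either
# from a source field (optionally transformed) or from a static value.
mapping_template = {
    "foo":    {"source": "foo", "transform": str.upper},  # map + transform
    "origin": {"static": "dsync-demo"},                   # hard-coded value
}

def apply_mapping(source_record: dict, template: dict) -> dict:
    """Build a destination record by applying each field's mapping rule."""
    out = {}
    for dest_field, rule in template.items():
        if "static" in rule:
            out[dest_field] = rule["static"]
        else:
            value = source_record[rule["source"]]
            transform = rule.get("transform")
            out[dest_field] = transform(value) if transform else value
    return out
```

Applied to a source record `{"foo": "bar"}`, this template would produce `{"foo": "BAR", "origin": "dsync-demo"}`.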
Click where it says "click to add field" to open up the available source fields to map.
Select the 'foo' field to map it onto the destination field, close the pop-up and save the mapping.
6) Run job
You should now have both endpoints on the canvas connected by a link. To run the job, simply right-click the middle point of the link and select 'Run' from the menu. Alternatively, select the job by clicking on the middle point of the link and hit the 'RUN' button in the right sidebar.
As the source endpoint is a manual Upload source, you will be prompted to upload a file. The file type must match the source data layout - a JSON file in our case - and it needs the appropriate extension. The data must also conform to the source data layout. See the attached 'test.json' file - use this file to run the job.
Once the file is uploaded, the process starts. The source data is parsed and mapped according to your mapping template - the value of the 'foo' field from the source file is mapped onto the 'foo' destination field. This becomes the body of the POST request sent to the destination API endpoint you configured.
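The steps above can be summarised as a parse-map-post pipeline. A hedged sketch in Python, using an illustrative file content (the actual attached 'test.json' may contain a different value):

```python
import json

# Illustrative upload content conforming to the source layout (one 'foo' field);
# the attached test.json may differ.
uploaded_file = '{"foo": "hello world"}'

# 1) Parse the source data according to the source data layout.
source_record = json.loads(uploaded_file)

# 2) Apply the mapping template: source 'foo' -> destination 'foo'.
destination_record = {"foo": source_record["foo"]}

# 3) Serialise the mapped record; this becomes the body of the POST
#    request sent to the configured destination API endpoint.
post_body = json.dumps(destination_record)
```

The URL, method, and headers for step 3 come from the destination endpoint settings configured earlier.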
To check what is happening behind the scenes, you can view the job's debug route. For more information on how to access a job's debug route, please refer to this article.
There are many other options available in DSYNC for connecting APIs - this was just a simple example of a POST call to an API. DSYNC supports all HTTP methods, various types of authentication (Basic, OAuth1, OAuth2, ...), remote execution of jobs, chaining of individual jobs via callbacks, a powerful mapping engine which allows you to transform the layout/schema as well as the data itself on the fly, and many more functions. If you would like us to add new functionality, please let us know by submitting a support request.