



All API requests require authentication via bearer token. After successfully connecting a system through Makini's authentication module, you receive an API token. Include this token in the Authorization header of your requests: `Authorization: Bearer YOUR_API_TOKEN`. Each connection has a unique token, allowing you to manage multiple customer connections independently. Tokens remain valid as long as the underlying system credentials are valid and the connection is active. If a customer changes their system credentials, you'll need to reconnect to obtain a new token.
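For example, a minimal authenticated request might look like the sketch below; the base URL and token value are placeholders, and `/api/v1/purchase-orders` is the endpoint path referenced elsewhere in this FAQ:

```python
import requests

API_BASE = "https://api.makini.io"   # assumed base URL, for illustration only
API_TOKEN = "YOUR_API_TOKEN"         # the token returned for a specific connection

# Every request carries the connection's bearer token in the Authorization header.
response = requests.get(
    f"{API_BASE}/api/v1/purchase-orders",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```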
Design your webhook receiver to handle duplicates and out-of-order webhooks, as network issues or retries can cause both scenarios. Keep the receiver lightweight—ideally writing incoming webhooks to a queue or reliable storage—then process them asynchronously. This prevents timeouts and allows your system to handle high-volume webhook spikes. Respond with a 200 status code immediately after receiving the webhook, before processing begins. Implement idempotency by tracking processed webhook IDs and skipping duplicates. Use constant-time comparison for signature verification to prevent timing attacks. If webhook processing fails, log the error but still return 200 to prevent unnecessary retries. Set up monitoring and alerts for webhook failures so you can investigate issues promptly. For critical workflows, combine webhooks with periodic polling as a fallback mechanism.
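A minimal receiver sketch along these lines, assuming an HMAC-SHA256 signature header and an `id` field on the event payload (both are illustrative conventions, not confirmed Makini behavior); the in-memory queue and set stand in for durable infrastructure:

```python
import hashlib
import hmac
import queue

from flask import Flask, request

app = Flask(__name__)

WEBHOOK_SECRET = b"your-shared-secret"   # assumed: a shared secret used to sign payloads
work_queue = queue.Queue()               # stand-in for a durable queue (e.g. SQS, Redis)
processed_ids = set()                    # stand-in for persistent idempotency storage


def signature_is_valid(payload: bytes, received_signature: str) -> bool:
    """Constant-time comparison of an HMAC-SHA256 signature (scheme assumed)."""
    expected = hmac.new(WEBHOOK_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_signature)


@app.post("/webhooks/makini")
def receive_webhook():
    # Verify the signature before trusting the payload.
    signature = request.headers.get("X-Webhook-Signature", "")  # header name assumed
    if not signature_is_valid(request.get_data(), signature):
        return "invalid signature", 401

    event = request.get_json(silent=True) or {}
    event_id = event.get("id")  # field name assumed

    # Idempotency: skip webhooks we have already accepted.
    if event_id in processed_ids:
        return "", 200
    if event_id:
        processed_ids.add(event_id)

    # Keep the receiver lightweight: enqueue and process asynchronously.
    try:
        work_queue.put_nowait(event)
    except Exception as exc:
        # Log the failure but still return 200 to avoid unnecessary retries.
        app.logger.error("failed to enqueue webhook: %s", exc)

    return "", 200
```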
Makini uses cursor-based pagination for retrieving large datasets. API responses include a `next_cursor` field when additional results are available. To retrieve the next page, include the cursor value in your next request: `GET /api/v1/purchase-orders?cursor=CURSOR_VALUE`. Cursor-based pagination is more reliable than offset-based pagination because it handles data changes between requests: if records are added or deleted while you're paginating, you won't miss records or see duplicates. Page size is configurable up to a maximum limit (typically 100-500 records per page, depending on entity type). For optimal performance, use the largest page size your application can handle efficiently. The API response also includes a total count when available from the source system.
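A sketch of a pagination loop along these lines; `next_cursor` and the endpoint path come from this answer, while the base URL, the `limit` parameter name, and the `data` result key are assumptions for illustration:

```python
import requests

API_BASE = "https://api.makini.io"       # assumed base URL, for illustration only
API_TOKEN = "YOUR_API_TOKEN"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}


def fetch_all_purchase_orders(page_size: int = 100) -> list[dict]:
    """Follow next_cursor until the API stops returning one."""
    results: list[dict] = []
    params = {"limit": page_size}        # page-size parameter name assumed
    cursor = None

    while True:
        if cursor:
            params["cursor"] = cursor
        response = requests.get(
            f"{API_BASE}/api/v1/purchase-orders",
            headers=HEADERS,
            params=params,
            timeout=30,
        )
        response.raise_for_status()
        payload = response.json()

        results.extend(payload.get("data", []))   # result key assumed
        cursor = payload.get("next_cursor")
        if not cursor:                            # no cursor means no more pages
            return results
```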
Yes, through Makini Flows, which includes connectors for popular databases such as Snowflake, PostgreSQL, MySQL, MongoDB, and others. This enables workflows that synchronize data between industrial systems and your data warehouse or analytics platforms. For example, you can sync purchase orders from SAP to Snowflake for analytics, or use database queries to drive integration logic. Database integrations use the same workflow builder as other integrations, making it easy to combine industrial system data with database operations. For direct database-to-database syncing, we can help design optimized workflows. Database connections are treated as custom integrations and may require additional workflow development.
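Outside of Flows, the same idea can be sketched by hand: pull records through the Makini API and upsert them into one of the listed databases. Everything below (the table schema, connection string, and the reuse of the pagination helper from the previous example) is illustrative, not a Flows configuration:

```python
import psycopg2  # PostgreSQL is one of the listed connectors; used here only for illustration
from psycopg2.extras import Json


def upsert_purchase_orders(orders: list[dict]) -> None:
    """Write purchase orders into a local analytics table (schema is hypothetical)."""
    conn = psycopg2.connect("dbname=analytics user=etl")  # connection details are placeholders
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            CREATE TABLE IF NOT EXISTS purchase_orders (
                id TEXT PRIMARY KEY,
                payload JSONB NOT NULL
            )
            """
        )
        for order in orders:
            # Upsert keyed on the order id so re-runs stay idempotent.
            cur.execute(
                """
                INSERT INTO purchase_orders (id, payload)
                VALUES (%s, %s)
                ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload
                """,
                (order.get("id"), Json(order)),
            )
    conn.close()


# e.g. upsert_purchase_orders(fetch_all_purchase_orders()) using the pagination helper above
```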
