



Have any questions? We're here to help you.
Does Makini support on-premises systems, and how are they connected?
Yes, Makini supports both cloud-based and on-premises systems. On-premises connections consume double the connection credits of cloud systems. Connecting typically requires opening specific ports and whitelisting Makini's IP addresses in your firewall configuration; for some on-premises systems, a VPN tunnel may also be necessary. We provide detailed technical requirements during implementation planning. Where security policies prohibit external connections entirely, we offer self-hosted deployment options in which Makini runs within your infrastructure, eliminating the need for external network access to on-premises systems.
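As a concrete illustration, the sketch below checks that a port opened for Makini actually accepts connections when probed from outside your firewall. It is a minimal sketch under stated assumptions: the hostname, port, and timeout are placeholders, not addresses or ports Makini publishes; use the values from your implementation plan.

```typescript
// Preflight connectivity check, run from a host OUTSIDE your firewall, to
// confirm that the port opened for Makini traffic accepts connections.
// HOST and PORT are placeholders for your on-premises system's endpoint.
import { Socket } from "node:net";

const HOST = "cmms.example.com"; // placeholder: your on-prem system's public hostname
const PORT = 8443;               // placeholder: the port opened for Makini traffic

function checkReachable(host: string, port: number, timeoutMs = 5000): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = new Socket();
    socket.setTimeout(timeoutMs);
    socket.once("connect", () => { socket.destroy(); resolve(true); });
    socket.once("timeout", () => { socket.destroy(); resolve(false); });
    socket.once("error", () => { socket.destroy(); resolve(false); });
    socket.connect(port, host);
  });
}

checkReachable(HOST, PORT).then((ok) =>
  console.log(`${HOST}:${PORT} -> ${ok ? "reachable" : "blocked (check firewall rules)"}`),
);
```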
How should I test my integration before going live?
Testing should cover authentication, data retrieval, data writing, error handling, and workflow logic. Start by connecting a test system through Makini's authentication flow, using sandbox or non-production instances of your target systems where available. Exercise API calls for each entity type you will use (purchase orders, work orders, etc.) to verify data mapping and field coverage, and probe error scenarios by submitting invalid inputs or attempting operations without the required permissions. For workflow-based integrations, test each workflow step independently before testing end to end. Verify webhook delivery and signature verification, test with realistic data volumes to surface performance issues, and include connection-failure scenarios to confirm that your error handling and retry logic behave correctly.
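The sketch below shows what a few of these checks might look like in practice. It is illustrative only: the base URL, endpoint paths, and bearer-token handling are hypothetical stand-ins, not documented Makini API routes; substitute the real values from your integration.

```typescript
// Hedged integration-test sketch. BASE_URL, TOKEN, and the endpoint paths
// are placeholders; replace them with your actual configuration.
const BASE_URL = process.env.MAKINI_BASE_URL ?? "https://api.example-sandbox.test";
const TOKEN = process.env.MAKINI_TOKEN ?? "";

async function call(path: string, init: RequestInit = {}): Promise<Response> {
  return fetch(`${BASE_URL}${path}`, {
    ...init,
    headers: {
      Authorization: `Bearer ${TOKEN}`,
      "Content-Type": "application/json",
      ...(init.headers as Record<string, string> | undefined),
    },
  });
}

async function run(): Promise<void> {
  // 1. Authentication and data retrieval: a valid token should return data.
  const list = await call("/purchase-orders?limit=1");
  console.assert(list.ok, `expected 2xx, got ${list.status}`);

  // 2. Field coverage: confirm the fields your workflow depends on are mapped.
  const payload = await list.json();
  console.assert(payload !== null && typeof payload === "object", "unexpected payload shape");

  // 3. Error handling: an invalid body should yield a clean 4xx, not a crash.
  const bad = await call("/purchase-orders", { method: "POST", body: JSON.stringify({}) });
  console.assert(bad.status >= 400 && bad.status < 500, "expected a 4xx for invalid input");

  // 4. Missing permissions: an unauthenticated request should be rejected.
  const noAuth = await fetch(`${BASE_URL}/purchase-orders`);
  console.assert(noAuth.status === 401 || noAuth.status === 403, "expected 401 or 403");
}

run().catch((err) => { console.error(err); process.exit(1); });
```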
Can I test without affecting production systems?
Yes. Makini provides sandbox connections for testing without touching production systems. Sandbox connections include sample data representing common scenarios: standard purchase orders, orders with custom fields, orders in various states (draft, approved, completed), and error cases such as invalid vendors or out-of-stock items. Sandbox data is read-only for safety: write operations return success responses without modifying the underlying data, so you can exercise your integration logic thoroughly without risk. For testing against a specific system, we recommend connecting a dedicated test instance of that system (such as an SAP sandbox environment) through Makini, which provides the most realistic testing experience.
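One way to confirm the read-only behavior described above is to issue a write and then re-read the record, as in this sketch. The base URL, entity path, and record ID are hypothetical placeholders, not Makini sample data.

```typescript
// Verify sandbox read-only behavior: a write reports success, but a re-read
// shows the sample data is unchanged. All paths and IDs are placeholders.
const BASE = process.env.MAKINI_SANDBOX_URL ?? "https://api.example-sandbox.test";

async function get(path: string): Promise<unknown> {
  const res = await fetch(`${BASE}${path}`);
  return res.json();
}

async function main(): Promise<void> {
  const before = await get("/work-orders/WO-1001"); // hypothetical sample record

  // A write against a sandbox connection reports success...
  const update = await fetch(`${BASE}/work-orders/WO-1001`, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ status: "completed" }),
  });
  console.assert(update.ok, "sandbox writes should return success responses");

  // ...but the underlying sample data stays unchanged.
  const after = await get("/work-orders/WO-1001");
  console.assert(JSON.stringify(after) === JSON.stringify(before),
    "sandbox data should be unchanged after a write");
}

main();
```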
What uptime does Makini guarantee?
Our standard SLA targets 99.9% uptime for cloud deployments, which translates to less than 9 hours of downtime per year. For enterprise customers with critical integration requirements, we offer enhanced SLAs of up to 99.99% (under an hour of downtime per year) through multi-region redundancy and dedicated infrastructure. SLAs cover the Makini platform itself; the availability of connected third-party systems is outside our control, though we monitor their health and alert you to issues. For self-hosted deployments, uptime depends on your infrastructure configuration, and we provide architecture guidance to help you meet your availability targets. We also maintain a public status page showing real-time system health and incident history.
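For reference, the downtime figures follow directly from the availability percentages; this small sketch reproduces the arithmetic.

```typescript
// Yearly downtime budget implied by an availability percentage.
function downtimeBudget(availabilityPct: number): string {
  const hoursPerYear = 365 * 24; // 8,760 hours in a non-leap year
  const downtimeHours = hoursPerYear * (1 - availabilityPct / 100);
  return downtimeHours >= 1
    ? `${downtimeHours.toFixed(2)} hours/year`
    : `${(downtimeHours * 60).toFixed(1)} minutes/year`;
}

console.log(downtimeBudget(99.9));  // "8.76 hours/year"  (under 9 hours)
console.log(downtimeBudget(99.99)); // "52.6 minutes/year" (under an hour)
```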
