Recipes For Better Integrations

As organizations standardize on the cloud, data professionals need to quickly integrate and load data across cloud systems, data warehouses and data lakes. Many departments and small businesses lack the bandwidth, and sometimes the skill set, to use existing solutions, causing delays that lead to missed opportunities and poor customer experiences.

Organizations need the agility to accelerate automation, digital transformation and data-driven decision making. However, existing solutions are not designed to easily integrate, transform, aggregate, process and quickly move large amounts of distributed data.

Until now. Experience the difference with Lingk Recipes.

 

The Lingk Experience


Key Features

  • Build powerful data transformation, loading and integration solutions with the Recipe Editor.

  • Accelerate recipe development with the Visual SQL Wizard.

  • Manage projects in collaborative, dynamic data workspaces.

  • LingkQL combines Apache Spark's in-memory processing power (up to 100X faster) with integration-focused SQL statements.

  • Connect events to listeners in workflow tools (like Zapier) or iPaaS platforms (like MuleSoft) using custom webhooks.

 

Data Loading, Integration and Automation - Simplified.

Visual SQL Wizard

SQL Recipes


Recipes enable many different workloads - batch, streaming, and big data analytics - to be processed in parallel and in record time. Recipes are human-readable, reusable YAML configurations that combine connector information, data orchestration, and powerful SQL commands to enable almost any type of transformation and aggregation on large sets of data.
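As an illustration only (the exact recipe schema is Lingk's own; every key and connector name below is hypothetical), a recipe that reads a file, transforms it with SQL, and writes the result to a target system might be sketched like this:

```yaml
# Hypothetical sketch of a Lingk recipe - key names are illustrative,
# not the actual Lingk schema.
sources:
  - name: contacts_csv          # staging table name used by the SQL below
    connector: s3
    path: s3://my-bucket/contacts.csv
    format: csv

statements:
  - sql: >
      SELECT lower(email) AS email, first_name, last_name
      FROM contacts_csv
      WHERE email IS NOT NULL
    target: clean_contacts      # result becomes a new staging table

targets:
  - name: clean_contacts
    connector: salesforce       # write the cleaned rows out
    object: Contact
    operation: upsert
```

The pattern to note is the flow: connectors stage data into named tables, SQL statements transform those tables, and targets write the results back out.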


The Visual SQL Wizard enables fast and smart data loading, processing, integration and automation solutions by generating recipes for on-premise or cloud-based systems and data warehouses. The wizard is much easier to use than typical drag-and-drop tools, providing specific guidance for building quality data transformations, pipelines and automations. 

 

Connect Enterprise and Industry

Lingk connectors remove the complexity of connecting to files, cloud and on-premise systems, data warehouses and data lakes. Connectors read data into an in-memory staging table so you can quickly transform, cleanse and write data using recipes.

 
Connectors include Salesforce, Oracle, Amazon Web Services, Microsoft, and Google Cloud.
 
Transforming hundreds of thousands or millions of records across multiple systems can happen in minutes. For example, syncing and deduping nearly one million records between PeopleSoft and Salesforce happens in less than 20 minutes.
 

Anatomy of a SQL Recipe

A Lingk SQL Recipe is simply a YAML file that combines SQL statements with configuration describing "where" data resides and "what" should process and output it.


Connectors

Connectors read data into in-memory staging tables and write data back out through recipes, taking advantage of the engine's in-memory speed. Use connectors with recipes to join, aggregate, cleanse and transform datasets. Many connectors can both read and write data.

Post-Processors

After data is read by a connector, a post-processor can add immediate value to your dataset. A commonly used post-processor is "data diff" - which compares new data with old data - and helps you process change sets based on new, updated, or deleted records.
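A sketch of how such a post-processor might be declared (the key names and connector are hypothetical, not the actual Lingk schema):

```yaml
# Hypothetical configuration - illustrative only.
sources:
  - name: students
    connector: sftp
    path: /exports/students.csv
    format: csv
    postProcessors:
      - type: dataDiff           # compare this run's data with the last run's
        keyColumns: [student_id] # identity used to match old and new rows
        # later SQL statements can then act only on the change set,
        # e.g. rows flagged as new, updated, or deleted
```

The value of the pattern is that downstream statements process only what changed, rather than reloading the full dataset on every run.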

Formats

Files can be stored in different file formats, and APIs can return different content types. Using formats, you tell a recipe which format to use when reading data in or writing data out.
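For example (again with hypothetical keys), a format setting can tell the recipe how to parse an incoming file and how to serialize the outgoing one:

```yaml
# Hypothetical configuration - illustrative only.
sources:
  - name: orders
    connector: s3
    path: s3://my-bucket/orders.json
    format: json        # parse the input as JSON
targets:
  - name: orders_out
    connector: s3
    path: s3://my-bucket/orders.parquet
    format: parquet     # serialize the output as Parquet
```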

Statements

LingkQL (Lingk SQL) statements use Apache Spark SQL to select data into new tables or write existing tables out to connectors. Each statement in a recipe represents another step in the recipe solution. Statements can call out to Lingk APIs to trigger other recipes or external workflows via webhooks.
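Because statements are standard Apache Spark SQL, familiar constructs like joins and aggregations work as expected. For instance (the table and column names here are assumed, not from the source), a step that joins two staged tables and aggregates per account might read:

```sql
-- Each statement's result becomes a new staging table for later steps.
SELECT c.account_id,
       count(o.order_id) AS order_count,
       sum(o.amount)     AS lifetime_value
FROM customers c
LEFT JOIN orders o ON o.account_id = c.account_id
GROUP BY c.account_id
```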

 

Lingk Platform Components

 
 

The Lingk Transformer Engine (LTE) is the in-memory SQL Recipe engine that processes data for the Lingk Platform. Built on top of Apache Spark, the LTE can power all your ETL, event streaming and big data analytics needs. A Lingk recipe is how you harness the full power of the LTE and orchestrate data flows.

In a SaaS deployment, all data flowing through the LTE is “stateless” and kept in memory. This means your data is never persisted on Lingk's servers.

A hybrid deployment of LTE is typically used if your organization’s compliance or policy requirements prevent you from using a multi-tenant cloud service. LTE can be deployed in hybrid implementations on Amazon AWS, Google Cloud, Microsoft Azure and Oracle Cloud.

Security by Design

Your data is owned by you, and we take careful measures to keep it as safe as possible. Our infrastructure runs on Amazon Web Services (AWS), which has achieved ISO 27001 certification and successfully completed multiple SSAE 16 audits.

Enterprise Support

Have questions? Our team is here to help.

From general inquiries about the Lingk Platform to more technical questions about specific recipe solutions, we've got you covered. We offer standard and premium support options, and our platform includes in-app customer communication.