cynkra


Intelligently R

From natural language interfaces to automated analysis pipelines, we make AI accessible and practical within the R ecosystem.

Our Offer

Supercharge R with AI

No-Code AI Workflows


Our blockr.ai framework lets you analyze data using natural language instead of code. Describe what you want in plain English - create plots, filter data, generate summaries - and watch it happen. Non-technical team members can explore data independently, while developers can extend the system with custom blocks. It bridges the gap between AI capability and everyday data work.
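To make the idea concrete, here is the kind of translation an AI-driven block performs behind the scenes: a plain-English request on one side, generated R code on the other. The snippet below is illustrative only and assumes the familiar dplyr and ggplot2 packages; the code blockr.ai actually produces depends on your data and your prompt.

    # Illustrative only: R code an AI block might generate from the request
    # "show average miles per gallon by cylinder count as a bar chart"
    library(dplyr)
    library(ggplot2)

    mtcars |>
      group_by(cyl) |>
      summarise(mean_mpg = mean(mpg)) |>
      ggplot(aes(x = factor(cyl), y = mean_mpg)) +
      geom_col()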

Explore blockr

Local LLMs


We deploy Large Language Models on your own infrastructure. Your data never leaves your servers - essential for sensitive business, financial, or healthcare data. Local models can be fine-tuned to understand your domain terminology and workflows. You control the hardware, avoid per-token cloud costs, and get predictable performance without internet dependency.
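As a rough illustration of what "on your own infrastructure" looks like from R, the sketch below sends a prompt to a locally running Ollama server over its HTTP API using httr2. The server address, model name, and prompt are assumptions for the example; any locally hosted model exposing a similar endpoint works the same way.

    # Minimal sketch: query a locally hosted model through Ollama's HTTP API.
    # Assumes an Ollama server on localhost:11434 with a model already pulled.
    library(httr2)

    ask_local <- function(prompt, model = "llama3.1") {
      resp <- request("http://localhost:11434/api/generate") |>
        req_body_json(list(model = model, prompt = prompt, stream = FALSE)) |>
        req_perform()
      resp_body_json(resp)$response
    }

    ask_local("Explain retrieval-augmented generation in one sentence.")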

Custom AI Integration

We specialize in privacy-aware AI solutions using local Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) systems. Our solutions keep your data private while providing customized models that understand your domain-specific requirements.

Contact us

Featured Projects


A high-performance vector database implementation for R, enabling efficient similarity search and retrieval for AI applications.

Key Features

  • High-performance vector similarity search
  • Efficient indexing for large datasets
  • Integration with R AI workflows
  • Support for multiple distance metrics
open source project
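As a conceptual illustration of what vector similarity search does, independent of the package's actual interface, the base-R sketch below scores a set of stored embeddings against a query vector with cosine similarity and returns the nearest matches.

    # Conceptual sketch of similarity search in base R: find the stored
    # embeddings closest to a query vector under cosine similarity.
    cosine_sim <- function(a, b) sum(a * b) / (sqrt(sum(a^2)) * sqrt(sum(b^2)))

    set.seed(1)
    embeddings <- matrix(rnorm(5 * 4), nrow = 5)  # 5 stored vectors, 4 dimensions
    query <- rnorm(4)

    sims <- apply(embeddings, 1, cosine_sim, b = query)
    order(sims, decreasing = TRUE)[1:3]           # indices of the 3 closest vectors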

Our ask package lets you interact with AI models directly from R, going beyond simple text responses.

Key Features

  • Script and documentation editing in place
  • Code and test generation
  • Package documentation querying
  • Natural language data processing
  • Support for both cloud (GPT-4) and local (Llama) models
open source project
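A hypothetical call might look like the sketch below; the function name, arguments, and file path are illustrative assumptions rather than the package's documented interface, and the configured model could be either a cloud or a local one.

    # Hypothetical usage sketch -- names and arguments are illustrative
    # assumptions, not the documented interface of the ask package.
    library(ask)

    # Ask the configured model to draft a unit test for an existing function
    ask("Write a testthat test for my_summary()", context = "R/my_summary.R")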

blockr.ai extends our blockr framework with AI capabilities for natural language-driven data analysis.

Key Features

  • AI-powered plot creation through natural language
  • Intelligent data transformations
  • Integration with leading AI models
  • Composable blocks for flexible workflows
  • Seamless integration with the blockr ecosystem
open source project

Local AI Solutions


  • Complete Data Privacy

    Your data never leaves your servers. We deploy open-source models like gpt-oss (20B and 120B variants) that run entirely on your infrastructure.

  • Customized Models

    We configure and optimize local LLMs to understand your domain-specific terminology, workflows, and requirements - no cloud dependency needed.

  • Performance & Cost Efficiency

    Eliminate per-token API costs. Local deployment means predictable pricing, lower latency, and models that scale with your hardware investment.

AI Insights

Practical examples of working with LLMs in R


David Schoch, Christoph Sax

R with RAGS: An Introduction to rchroma and ChromaDB

LLM/RAG/R

Large language models (LLMs) are developing rapidly, but they often lack real-time, specific information. Retrieval-augmented generation (RAG) addresses this by letting LLMs fetch relevant documents during text generation, instead of just using their internal—and potentially outdated—knowledge.
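A minimal sketch of that retrieval step, independent of any particular vector store: embed the question, pick the most similar documents, and prepend them to the prompt before generation. The embed() function and the document embeddings are assumed to be supplied elsewhere, for instance by a local embedding model.

    # Minimal RAG sketch: retrieve the k most relevant documents by cosine
    # similarity and build an augmented prompt for the LLM. `embed` and
    # `doc_embeddings` are assumed to come from an embedding model.
    rag_prompt <- function(question, docs, doc_embeddings, embed, k = 3) {
      q <- embed(question)
      sims <- apply(doc_embeddings, 1, function(d)
        sum(d * q) / (sqrt(sum(d^2)) * sqrt(sum(q^2))))
      context <- docs[order(sims, decreasing = TRUE)[seq_len(k)]]
      paste0(
        "Answer using only the context below.\n\nContext:\n",
        paste(context, collapse = "\n---\n"),
        "\n\nQuestion: ", question
      )
    }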


Christoph Sax

Playing with AI Agents in R

LLM/R

It's local LLM time! What an adventure it has been since I first started exploring local LLMs. With the introduction of various new Llama models, we now have impressive small and large models that run seamlessly on consumer hardware.


Christoph Sax

Playing with Llama 3.1 in R

LLM

Meta recently announced Llama 3.1, and there's a lot of excitement. I finally had some time to experiment with locally run open-source models. The small 8B model, in particular, produces surprisingly useful output at reasonable speed.