#Contest

0 Followers · 412 Posts

The Contest tag unites posts related to any coding competition taking place on the InterSystems Developer Community.

Article Andre Larsen Barbosa · May 20, 2025 3m read


Just like a knockout punch that gives the opponent no chance, Kubernetes, as an open-source platform, offers a universe of opportunities thanks to its availability (i.e., the ease of finding support, services, and tools). It is a platform that can manage jobs and services in containers, which greatly simplifies the configuration and automation of these processes.

But let's justify the title image and give the tool in question the “correct” name: InterSystems Kubernetes Operator.

The principle is simple: you choose the services and define the rules of the game (a knockout reference again), and everything is provided in the most transparent and efficient way possible. This applies to installation, repair, and eventual restoration whenever the deployment no longer meets the pre-defined requirements.

But what differentiates IKO from any other operator? As an extension of the Kubernetes API (let's call it K8s for short), IKO introduces the IrisCluster custom resource, which can be deployed as a sharded IRIS cluster, a distributed cache cluster, or even a standalone instance. All of this on the most diverse Kubernetes platforms available. Last but not least, it also includes InterSystems cluster management features, which automate tasks such as adding nodes, which previously could only be done manually.

This is all very nice, with its references to sports and games, but why do I need it? The answer is relatively simple. You don't need IrisCluster to bring InterSystems IRIS to K8s; however, since K8s knows nothing about the application it runs, you would have to create the definitions and eventual scripts to configure those IRIS instances yourself. IKO automates this process and makes maintenance easier. Containers are a great way to package this whole collection of activities that need to happen.

However, taking advantage of the opportunity, do you know what a container is? A hint, it's not just a board game.

Container, the Shipping Container Board Game

The answer has much more to do with the "transportation" of a package: a container packages and isolates applications and services so that they can run separately from the rest, making it easier to "transport" them from one environment to another as needed.

Taking advantage of the vast InterSystems documentation, below is the link to the IKO installation and the subsequent configuration and adjustment steps.

https://docs.intersystems.com/components/csp/docbook/DocBook.UI.Page.cls?KEY=AIKO#AIKO_install

And so that no one is left wondering about the nickname "K8s": the name Kubernetes comes from the Greek for pilot or helmsman; in short, the one who steers. And the number of characters between the initial "K" and the final "s" is 8. Thus, "K8s".

Article Maria Nesterenko · Mar 14, 2024 7m read

Artificial Intelligence (AI) is getting a lot of attention lately because it can change many areas of our lives. Better computer power and more data have helped AI do amazing things, like improving medical tests and making self-driving cars. AI can also help businesses make better decisions and work more efficiently, which is why it's becoming more popular and widely used. How can one integrate the OpenAI API calls into an existing IRIS Interoperability application?

 

Article Henry Pereira · Apr 2, 2025 17m read

Image generated by OpenAI DALL·E

I'm a huge sci-fi fan, and while I'm fully on board the Star Wars train (apologies to my fellow Trekkies!), I've always appreciated the classic episodes of Star Trek from my childhood. The diverse crew of the USS Enterprise, each member mastering a unique role, is a perfect metaphor for how AI agents work and how we use them in our DC-Facilis project. So let's assume the role of a starship captain, lead an AI crew into unexplored territories, and boldly go where no one has gone before!

Welcome to CrewAI!

To manage our AI crew, we use a fantastic framework called CrewAI. It's lean, lightning-fast, and operates as a multi-agent Python platform. One of the reasons we love it, besides the fact that it was created by another Brazilian, is its incredible flexibility and role-based design.

from crewai import Agent, Task, Crew
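
Before meeting the agents, here is a minimal sketch of how these pieces fit together (the agent and task variables stand in for the ones defined throughout this article):

    # A crew bundles agents and their tasks; kickoff() runs them
    crew = Crew(
        agents=[extraction_agent, validation_agent],
        tasks=[extraction_task, validation_task],
    )
    result = crew.kickoff()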


Meet the Planners

In Facilis, our AI agents are divided into two groups. Let's start with the first one I like to call "The Planners."

The Extraction Agent

The main role of Facilis is to take a natural language description of a REST service and auto-magically create all the necessary interoperability. So, our first crew member is the Extraction Agent. This agent is tasked with "extracting" API specifications from a user's prompt description.

Here's what the Extraction Agent looks out for:

  • Host (required)
  • Endpoint (required)
  • HTTP method (required)
  • Params (optional)
  • Port (if available)
  • JSON model (for POST/PUT/PATCH/DELETE)
  • Authentication (if applicable)

    def create_extraction_agent(self) -> Agent:
        return Agent(
            role='API Specification Extractor',
            goal='Extract API specifications from natural language descriptions',
            backstory=dedent("""
                You are specialized in interpreting natural language descriptions
                and extracting structured API specifications.
            """),
            allow_delegation=True,
            llm=self.llm
        )

    def extract_api_specs(self, descriptions: List[str]) -> Task:
        return Task(
            description=dedent(f"""
                Extract API specifications from the following descriptions:
                {json.dumps(descriptions, indent=2)}
                
                For each description, extract:
                - host (required)
                - endpoint (required)
                - HTTP_Method (required)
                - params (optional)
                - port (if available)
                - json_model (for POST/PUT/PATCH/DELETE)
                - authentication (if applicable)
                
                Mark any missing required fields as 'missing'.
                Return results in JSON format as an array of specifications.
            """),
            expected_output="""A JSON array containing extracted API specifications with all required and optional fields""",
            agent=self.extraction_agent
        )

The Validation Agent

Next up, the Validation Agent! Their mission is to ensure that the API specifications gathered by the Extraction Agent are correct and consistent. They check:

  1. Valid host format
  2. Endpoint starting with '/'
  3. Valid HTTP methods (GET, POST, PUT, DELETE, PATCH)
  4. Valid port number (if provided)
  5. JSON model presence for applicable methods.

    def create_validation_agent(self) -> Agent:
        return Agent(
            role='API Validator',
            goal='Validate API specifications for correctness and consistency',
            backstory=dedent("""
                You are an expert in API validation, ensuring all specifications
                meet the required standards and format.
            """),
            allow_delegation=False,
            llm=self.llm
        )

    def validate_api_spec(self, extracted_data: Dict) -> Task:
        return Task(
            description=dedent(f"""
                Validate the following API specification:
                {json.dumps(extracted_data, indent=2)}
                
                Check for:
                1. Valid host format
                2. Endpoint starts with '/'
                3. Valid HTTP method (GET, POST, PUT, DELETE, PATCH)
                4. Valid port number (if provided)
                5. JSON model presence for POST/PUT/PATCH/DELETE methods
                
                Return validation results in JSON format.
            """),
            expected_output="""A JSON object containing validation results with any errors or confirmation of validity""",
            agent=self.validation_agent
        )

The Interaction Agent

Moving on, we meet the Interaction Agent, our User Interaction Specialist. Their role is to obtain any missing API specification fields that were marked by the Extraction Agent and validate them based on the Validation Agent's findings. They interact directly with users to fill any gaps.
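
The article doesn't show this agent's definition, but following the same pattern as the other planners, a sketch might look like this (the role, goal, and backstory wording here are my assumptions, not the project's actual code):

    def create_interaction_agent(self) -> Agent:
        return Agent(
            role='User Interaction Specialist',
            goal='Collect missing or invalid API specification fields from the user',
            backstory=dedent("""
                You talk to users to resolve gaps in API specifications,
                asking targeted questions about fields marked as 'missing'
                and fields flagged as invalid during validation.
            """),
            allow_delegation=False,
            llm=self.llm
        )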

The Production Agent

We need two crucial pieces of information to create the necessary interoperability: namespace and production name. The Production Agent engages with users to gather this information, much like the Interaction Agent.
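
Its task isn't listed either; a hedged sketch, reusing the production_info fields (production_name, namespace, create_new) that appear later in send_to_iris, could be:

    def gather_production_info(self) -> Task:
        return Task(
            description=dedent("""
                Ask the user for the target namespace and production name.
                If the production does not exist yet, confirm whether a new
                one should be created.
                Return a JSON object with: namespace, production_name,
                and create_new (boolean).
            """),
            expected_output="""A JSON object containing namespace, production_name, and create_new""",
            agent=self.production_agent
        )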

The Documentation Transformation Agent

Once the specifications are ready, it's time to convert them into OpenAPI documentation. The Documentation Transformation Agent, an OpenAPI specialist, takes care of this.

    def create_transformation_agent(self) -> Agent:
        return Agent(
            role='OpenAPI Transformation Specialist',
            goal='Convert API specifications into OpenAPI documentation',
            backstory=dedent("""
                You are an expert in OpenAPI specifications and documentation.
                Your role is to transform validated API details into accurate
                and comprehensive OpenAPI 3.0 documentation.
            """),
            allow_delegation=False,
            llm=self.llm
        )

    def transform_to_openapi(self, validated_endpoints: List[Dict], production_info: Dict) -> Task:
        return Task(
            description=dedent(f"""
                Transform the following validated API specifications into OpenAPI 3.0 documentation:
                
                Production Information:
                {json.dumps(production_info, indent=2)}
                
                Validated Endpoints:
                {json.dumps(validated_endpoints, indent=2)}
                
                Requirements:
                1. Generate complete OpenAPI 3.0 specification
                2. Include proper request/response schemas
                3. Document all parameters and request bodies
                4. Include authentication if specified
                5. Ensure proper path formatting
                
                Return the OpenAPI specification in both JSON and YAML formats.
            """),
            expected_output="""A JSON object containing the complete OpenAPI 3.0 specification with all endpoints and schemas""",
            agent=self.transformation_agent
        )

The Review Agent

After transformation, the OpenAPI documentation undergoes a meticulous review to ensure compliance and quality. The Review Agent follows this checklist:

  1. OpenAPI 3.0 Compliance
  • Correct version specification
  • Required root elements
  • Schema structure validation
  2. Completeness
  • All endpoints documented
  • Fully specified parameters
  • Defined request/response schemas
  • Properly configured security schemes
  3. Quality Checks
  • Consistent naming conventions
  • Clear descriptions
  • Proper use of data types
  • Meaningful response codes
  4. Best Practices
  • Proper tag usage
  • Consistent parameter naming
  • Appropriate security definitions
Finally, if everything looks good, the Review Agent reports a healthy JSON object with the following structure:

{
 "is_valid": boolean,
 "approved_spec": object (the reviewed and possibly corrected OpenAPI spec),
 "issues": [array of strings describing any issues found],
 "recommendations": [array of improvement suggestions]
}

    def create_reviewer_agent(self) -> Agent:
        return Agent(
            role='OpenAPI Documentation Reviewer',
            goal='Ensure OpenAPI documentation compliance and quality',
            backstory=dedent("""
                You are the final authority on OpenAPI documentation quality and compliance.
                With extensive experience in OpenAPI 3.0 specifications, you meticulously
                review documentation for accuracy, completeness, and adherence to standards.
            """),
            allow_delegation=True,
            llm=self.llm
        )


    def review_openapi_spec(self, openapi_spec: Dict) -> Task:
        return Task(
            description=dedent(f"""
                Review the following OpenAPI specification for compliance and quality:
                
                {json.dumps(openapi_spec, indent=2)}
                
                Review Checklist:
                1. OpenAPI 3.0 Compliance
                - Verify correct version specification
                - Check required root elements
                - Validate schema structure
                
                2. Completeness
                - All endpoints properly documented
                - Parameters fully specified
                - Request/response schemas defined
                - Security schemes properly configured
                
                3. Quality Checks
                - Consistent naming conventions
                - Clear descriptions
                - Proper use of data types
                - Meaningful response codes
                
                4. Best Practices
                - Proper tag usage
                - Consistent parameter naming
                - Appropriate security definitions
                
                You must return a JSON object with the following structure:
                {{
                    "is_valid": boolean,
                    "approved_spec": object (the reviewed and possibly corrected OpenAPI spec),
                    "issues": [array of strings describing any issues found],
                    "recommendations": [array of improvement suggestions]
                }}
            """),
            expected_output="""A JSON object containing: is_valid (boolean), approved_spec (object), issues (array), and recommendations (array)""",
            agent=self.reviewer_agent
        )

The Iris Agent

The last agent in the planner group is the Iris Agent, who sends the finalized OpenAPI documentation to Iris.


    def create_iris_i14y_agent(self) -> Agent:
        return Agent(
            role='Iris I14y Integration Specialist',
            goal='Integrate API specifications with Iris I14y service',
            backstory=dedent("""
                You are responsible for ensuring smooth integration between the API
                documentation system and the Iris I14y service. You handle the
                communication with Iris, validate responses, and ensure successful
                integration of API specifications.
            """),
            allow_delegation=False,
            llm=self.llm
        )

    def send_to_iris(self, openapi_spec: Dict, production_info: Dict, review_result: Dict) -> Task:
        return Task(
            description=dedent(f"""
                Send the approved OpenAPI specification to Iris I14y service:

                Production Information:
                - Name: {production_info['production_name']}
                - Namespace: {production_info['namespace']}
                - Is New: {production_info.get('create_new', False)}

                Review Status:
                - Approved: {review_result['is_valid']}
                
                Return the integration result in JSON format.
            """),
            expected_output="""A JSON object containing the integration result with Iris I14y service, including success status and response details""",
            agent=self.iris_i14y_agent
        )

class IrisI14yService:
    def __init__(self):
        self.logger = logging.getLogger('facilis.IrisI14yService')
        self.base_url = os.getenv("FACILIS_URL", "http://dc-facilis-iris-1:52773") 
        self.headers = {
            "Content-Type": "application/json"
        }
        self.timeout = int(os.getenv("IRIS_TIMEOUT", "504"))  # in milliseconds
        self.max_retries = int(os.getenv("IRIS_MAX_RETRIES", "3"))
        self.logger.info("IrisI14yService initialized")

    async def send_to_iris_async(self, payload: Dict) -> Dict:
        """
        Send payload to Iris generate endpoint asynchronously
        """
        self.logger.info("Sending payload to Iris generate endpoint")
        if isinstance(payload, str):
            try:
                json.loads(payload)  
            except json.JSONDecodeError:
                raise ValueError("Invalid JSON string provided")
        
        retry_count = 0
        last_error = None

        # Create timeout for aiohttp
        timeout = aiohttp.ClientTimeout(total=self.timeout / 1000)  # Convert ms to seconds

        while retry_count < self.max_retries:
            try:
                self.logger.info(f"Attempt {retry_count + 1}/{self.max_retries}: Sending request to {self.base_url}/facilis/api/generate")
                
                async with aiohttp.ClientSession(timeout=timeout) as session:
                    async with session.post(
                        f"{self.base_url}/facilis/api/generate",
                        json=payload,
                        headers=self.headers
                    ) as response:
                        if response.status == 200:
                            return await response.json()
                        response.raise_for_status()

            except asyncio.TimeoutError as e:
                retry_count += 1
                last_error = e
                error_msg = f"Timeout occurred (attempt {retry_count}/{self.max_retries})"
                self.logger.warning(error_msg)
                
                if retry_count < self.max_retries:
                    wait_time = 2 ** (retry_count - 1)
                    self.logger.info(f"Waiting {wait_time} seconds before retry...")
                    await asyncio.sleep(wait_time)
                continue

            except aiohttp.ClientError as e:
                error_msg = f"Failed to send to Iris: {str(e)}"
                self.logger.error(error_msg)
                raise IrisIntegrationError(error_msg)

        error_msg = f"Failed to send to Iris after {self.max_retries} attempts due to timeout"
        self.logger.error(error_msg)
        raise IrisIntegrationError(error_msg, last_error)

Meet the Generators

Our second set of agents is the Generators. They transform the OpenAPI specifications into InterSystems IRIS interoperability components. There are eight agents in this group.

The first one is the Analyzer Agent. Like a planner mapping out the route, its job is to delve into the OpenAPI specs and figure out which IRIS Interoperability components are needed.


    def create_analyzer_agent():
        return Agent(
            role="OpenAPI Specification Analyzer",
            goal="Thoroughly analyze OpenAPI specifications and plan IRIS Interoperability components",
            backstory="""You are an expert in both OpenAPI specifications and InterSystems IRIS Interoperability. 
            Your job is to analyze OpenAPI documents and create a detailed plan for how they should be 
            implemented as IRIS Interoperability components.""",
            verbose=False,
            allow_delegation=False,
            tools=[analyze_openapi_tool],
            llm=get_facilis_llm()
        )

    analysis_task = Task(
        description="""Analyze the OpenAPI specification and plan the necessary IRIS Interoperability components. 
        Include a list of all components that should be in the Production class.""",
        agent=analyzer,
        expected_output="A detailed analysis of OpenAPI spec and plan for IRIS components, including Production components list",
        input={
            "openapi_spec": openApiSpec,
            "production_name": "${production_name}" 
        }
    )

Next up, the Business Services (BS) and Business Operations (BO) Agents take over. They generate the Business Services and Business Operations based on the OpenAPI endpoints. They use a handy tool called MessageClassTool to generate the right message classes, ensuring smooth communication between components.


    def create_bs_generator_agent():
        return Agent(
            role="IRIS Production and Business Service Generator",
            goal="Generate properly formatted IRIS Production and Business Service classes from OpenAPI specifications",
            backstory="""You are an experienced InterSystems IRIS developer specializing in Interoperability Productions.
            Your expertise is in creating Business Services and Productions that can receive and process incoming requests based on
            API specifications.""",
            verbose=False,
            allow_delegation=True,
            tools=[generate_production_class_tool, generate_business_service_tool],
            llm=get_facilis_llm()
        )

    def create_bo_generator_agent():
        return Agent(
            role="IRIS Business Operation Generator",
            goal="Generate properly formatted IRIS Business Operation classes from OpenAPI specifications",
            backstory="""You are an experienced InterSystems IRIS developer specializing in Interoperability Productions.
            Your expertise is in creating Business Operations that can send requests to external systems
            based on API specifications.""",
            verbose=False,
            allow_delegation=True,
            tools=[generate_business_operation_tool, generate_message_class_tool],
            llm=get_facilis_llm()
        )

    bs_generation_task = Task(
        description="Generate Business Service classes based on the OpenAPI endpoints",
        agent=bs_generator,
        expected_output="IRIS Business Service class definitions",
        context=[analysis_task]
    )

    bo_generation_task = Task(
        description="Generate Business Operation classes based on the OpenAPI endpoints",
        agent=bo_generator,
        expected_output="IRIS Business Operation class definitions",
        context=[analysis_task]
    )

    class GenerateMessageClassTool(BaseTool):
        name: str = "generate_message_class"
        description: str = "Generate an IRIS Message class"
        input_schema: Type[BaseModel] = GenerateMessageClassToolInput

        def _run(self, message_name: str, schema_info: Union[str, Dict[str, Any]]) -> str:
            writer = IRISClassWriter()
            try:
                if isinstance(schema_info, str):
                    try:
                        schema_dict = json.loads(schema_info)
                    except json.JSONDecodeError:
                        return "Error: Invalid JSON format for schema info"
                else:
                    schema_dict = schema_info

                class_content = writer.write_message_class(message_name, schema_dict)
                # Store the generated class
                writer.generated_classes[f"MSG.{message_name}"] = class_content
                return class_content
            except Exception as e:
                return f"Error generating message class: {str(e)}"

Once BS and BO have done their thing, it's time for the Production Agent to shine! This agent pulls everything together to create a cohesive production environment.

After everything is set up, next in line is the Validation Agent. It comes in for a final checkup, making sure each IRIS class is valid.
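
Its task definition isn't shown in the article, but since the collection task below lists validate_task in its context, it presumably looks something like this sketch (the wording is an assumption):

    # Hypothetical sketch of the validation task referenced later as validate_task
    validate_task = Task(
        description="Validate each generated IRIS class for correct syntax and completeness",
        agent=validator,
        expected_output="A validation report covering every generated class",
        context=[bs_generation_task, bo_generation_task]
    )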

Then we have the Export Agent and the Collection Agent. The Export Agent generates the .cls files, while the Collection Agent gathers all the file names. Everything gets passed along to the importer, which compiles the classes into InterSystems IRIS.


    def create_exporter_agent():
        return Agent(
            role="IRIS Class Exporter",
            goal="Export and validate IRIS class definitions to proper .cls files",
            backstory="""You are an InterSystems IRIS deployment specialist. Your job is to ensure 
            that generated IRIS class definitions are properly exported as valid .cls files that 
            can be directly imported into an IRIS environment.""",
            verbose=False,
            allow_delegation=False,
            tools=[export_iris_classes_tool, validate_iris_classes_tool],
            llm=get_facilis_llm()
        )
        
    def create_collector_agent():
        return Agent(
            role="IRIS Class Collector",
            goal="Collect all generated IRIS class files into a JSON collection",
            backstory="""You are a file system specialist responsible for gathering and 
            organizing generated IRIS class files into a structured collection.""",
            verbose=False,
            allow_delegation=False,
            tools=[CollectGeneratedFilesTool()],
            llm=get_facilis_llm()
        )

    export_task = Task(
        description="Export all generated IRIS classes as valid .cls files",
        agent=exporter,
        expected_output="Valid IRIS .cls files saved to output directory",
        context=[bs_generation_task, bo_generation_task],
        input={
            "output_dir": "/home/irisowner/dev/output/iris_classes"  # Optional
        }
    )

    collection_task = Task(
        description="Collect all generated IRIS class files into a JSON collection",
        agent=collector,
        expected_output="JSON collection of all generated .cls files",
        context=[export_task, validate_task],
        input={
            "directory": "./output/iris_classes"
        }
    )

Limitations and Challenges

Our project started as an exciting experiment in which my fellow musketeers and I aimed to create a fully automated tool using agents. It was a wild ride! Our main focus was on REST API integrations. It's always a joy to get a task with an OpenAPI specification to integrate; legacy systems, however, can be a whole different story. We thought automating these tasks could be incredibly useful. But every adventure has its twists.

One of the biggest challenges was instructing the AI to convert OpenAPI into IRIS Interoperability. We started with OpenAI's GPT-3.5-turbo model, which in initial tests proved difficult to debug and prone to breaking. Switching to Anthropic's Claude 3.7 Sonnet showed better results for the Generator group, but not so much for the Planners... This led us to split our environment configurations, using different LLM providers for flexibility: GPT-3.5-turbo for planning and Claude Sonnet for generation. Great combo! This combination worked well, but we still encountered hallucinations. Moving to GPT-4o improved results, yet we still occasionally saw hallucinated IRIS classes and unnecessary OpenAPI specifications, like the renowned Pet Store example.

We had a blast learning along the way, and I'm super excited about the amazing future of this field, with countless possibilities!

Announcement Anastasia Dyubaylo · Mar 10, 2025

Hi Community!

It's time to celebrate our 25 fellow members who took part in the latest InterSystems Technical Article Contest and wrote

🌟 38 AMAZING ARTICLES 🌟

The competition was filled with outstanding articles, each showcasing innovation and expertise. With so many high-quality submissions, selecting the best was no easy task for the judges.

Let's meet the winners and look at their articles:

Announcement Anastasia Dyubaylo · Feb 27, 2025

Hey Community,

It's time for the first programming contest of the year, and there's a surprise, so read on! Please welcome:

🏆 InterSystems AI Programming Contest: Vector Search, GenAI, and AI Agents 🏆

Duration: March 17 - April 6, 2025

Prize pool: $12,000 + a chance to be invited to the GlobalSummit2025!


Announcement Evgeny Shvarov · Mar 17, 2025

Hi Developers!

Here are the technology bonuses for the InterSystems AI Programming Contest: Vector Search, GenAI, and AI Agents that will give you extra points in the voting:

  • Agent AI solution - 5
  • Vector Search usage - 4
  • Embedded Python - 3
  • LLM AI or LangChain usage: Chat GPT, Bard, and others - 3
  • IntegratedML usage - 3
  • Docker container usage - 2 
  • ZPM Package deployment - 2
  • Online Demo - 2
  • Implement InterSystems Community Idea - 4
  • Find a bug in Vector Search, or Integrated ML, or Embedded Python - 2
  • First Article on Developer Community - 2
  • Second Article On DC - 1
  • First Time Contribution - 3
  • Video on YouTube - 3
  • Suggest a new idea - 1

See the details below.

Announcement Irène Mykhailova · Aug 19, 2024

Hi Community!

We are excited to announce the first French technical article writing contest!

✍️ Technical Article Contest ✍️

This is the perfect opportunity for all InterSystems technology enthusiasts to share their knowledge and showcase their writing skills. No matter your experience level, everyone is welcome to participate. Articles can cover a wide range of technical topics related to InterSystems products or services. So, let your creativity and expertise run wild!

📅 Contest period: September 2-29, 2024

🎁 Gifts for all: a special gift is prepared for each participant!

🏅 Prizes for the authors of the best articles

Article Dmitry Maslennikov · Mar 2, 2025 5m read

After so many years of waiting, we finally got an official driver available on PyPI.

Additionally, I found that the JDBC driver has already been available on Maven for three months, and the .NET driver on NuGet for more than a month.

As an author of many implementations of IRIS support for various Python libraries, I wanted to check it out. Implementing DB-API means the driver should be interchangeable and should provide at least the functions defined in the standard; the only differences should be in the SQL dialect.

And the beauty of using existing libraries is that they already support other databases through the DB-API standard, so they already have clear expectations of how a driver should behave.

I decided to test the official InterSystems driver by implementing support for it in the SQLAlchemy-iris library.
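
For reference, DB-API 2.0 (PEP 249) pins down the shape every compliant driver exposes. This generic sketch uses a placeholder module name rather than the official driver's, and the connection parameters are illustrative, since PEP 249 deliberately leaves them to each driver:

    # Any DB-API 2.0 driver exposes connect(), cursors, execute(), and fetch methods.
    import some_dbapi_driver as db  # placeholder, not the real module name

    conn = db.connect(host="localhost", port=1972,
                      user="_SYSTEM", password="SYS")  # parameter names vary by driver
    cur = conn.cursor()
    cur.execute("SELECT 1")  # only the SQL dialect should differ between drivers
    print(cur.fetchall())
    conn.close()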

Article Andre Larsen Barbosa · Feb 11, 2025 2m read

   

Unlike the movie mentioned in the image (for those who don't know, The Matrix, 1999), the choice between Dynamic SQL and Embedded SQL is not a choice between truth and fantasy, but it is still a decision to be made. Below, I will try to make your choice easier.

If you need interactivity between the client and the application (and consequently the database), Dynamic SQL may be more appropriate, as it "adapts" very easily to changing queries. However, this dynamism has a cost: each new query has to be rebuilt, which can make execution more expensive. Below is a simple example of a Python code snippet.
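
The article's own snippet isn't included in this preview, but a hedged sketch of Dynamic SQL from Embedded Python could look like the following (the table and query are assumptions):

    import iris  # available inside IRIS Embedded Python

    # The query text and parameters are supplied at runtime, which is
    # what makes the SQL "dynamic"
    rs = iris.sql.exec("SELECT Name, Age FROM Sample.Person WHERE Age > ?", 30)
    for row in rs:
        print(row[0], row[1])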

Article Julio Esquerdo · Feb 10, 2025 8m read

Using SQL Gateway with Python, Vector Search, and Interoperability in InterSystems Iris

Part 3 – REST and Interoperability

Now that we have finished configuring the SQL Gateway, accessed the data from the external database via Python, and set up our vectorized base, we can run some queries. In this part of the article, we will use an application developed with CSP, HTML, and JavaScript that accesses an integration in IRIS, which performs the data similarity search, sends the result to the LLM, and finally returns the generated SQL. The CSP page calls an API in IRIS that receives the data to be used in the query and calls the integration. For more information about REST in IRIS, see the documentation available at https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cl…

Article Julio Esquerdo · Feb 18, 2025 22m read

REST API with Swagger in InterSystems IRIS

Hello

The HTTP protocol allows you to obtain resources, such as HTML documents. It is the basis of any data exchange on the Web and a client-server protocol, meaning that requests are initiated by the recipient, usually a Web browser.

REST APIs take advantage of this protocol to exchange messages between client and server. This makes REST APIs fast, lightweight, and flexible. REST APIs use the HTTP verbs GET, POST, PUT, DELETE, and others to indicate the actions they want to perform.

Article Julio Esquerdo · Feb 11, 2025 6m read

Using Flask, REST API, and IAM with InterSystems IRIS

Part 2 – Flask App

Flask is a web development microframework written in Python. It is known for being simple, flexible, and enabling rapid application development.

Installing Flask is very simple. Once you have Python installed correctly on your operating system, you need to install the Flask library with the pip command. For REST API consumption, it is advisable to use the requests library. The following link provides a guide to installing Flask: https://flask.palletsprojects.com/en/stable/installation/
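
As a hedged illustration of consuming a REST API with requests (the endpoint below is a placeholder, not the article's actual route):

    import requests

    # GET a resource and print the JSON body; raise on HTTP errors
    response = requests.get("http://localhost:52773/crud/persons/all")
    response.raise_for_status()
    print(response.json())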

Article Julio Esquerdo · Feb 11, 2025 9m read

Using Flask, REST API, and IAM with InterSystems IRIS

Part 1 - REST API

Hello

In this article, we will see the implementation of a REST API to perform CRUD maintenance, using Flask and IAM.

In this first part of the article, we will see the construction and publication of the REST API in IRIS.

First, let's create our persistent class to store the data. To do this, we go to IRIS and create our class:

Article Julio Esquerdo · Feb 10, 2025 7m read

Using SQL Gateway with Python, Vector Search, and Interoperability in InterSystems Iris

Part 2 – Python and Vector Search

Since we have access to the data from our external table, we can use everything that Iris has to offer with this data. Let's, for example, read the data from our external table and generate a polynomial regression with it.

For more information on using python with Iris, see the documentation available at https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=AFL_epython

Let's now consume the data from the external database to calculate a polynomial regression. To do this, we will use Python code to run an SQL query that reads our MySQL table and turns it into a pandas DataFrame:
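
The code itself is cut off in this preview; a sketch of the step described above might look like this (the linked-table name and columns are assumptions):

    import iris          # Embedded Python inside IRIS
    import pandas as pd

    # Read the external (SQL Gateway) table through IRIS SQL...
    rs = iris.sql.exec("SELECT x, y FROM External.MyTable")

    # ...and turn the rows into a pandas DataFrame, ready for e.g. numpy.polyfit
    df = pd.DataFrame([list(row) for row in rs], columns=["x", "y"])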

Article Julio Esquerdo · Feb 10, 2025 4m read

Using SQL Gateway with Python, Vector Search, and Interoperability in InterSystems Iris

Part 1 - SQL Gateway

Hello

In this article we will look at the use of SQL Gateway in IRIS. SQL Gateway allows IRIS to access tables in other (external) databases via ODBC or JDBC. We can access tables or views from various databases, such as Oracle, PostgreSQL, SQL Server, MySQL, and others.

Article Dmitry Maslennikov · Feb 5, 2025 8m read

From the previous article, we identified some issues when working with JSON in SQL.

IRIS offers a dedicated feature for handling JSON documents, called DocDB.

InterSystems IRIS® data platform DocDB is a facility for storing and retrieving database data. It is compatible with, but separate from, traditional SQL table and field (class and property) data storage and retrieval. It is based on JSON (JavaScript Object Notation) which provides support for web-based data exchange. InterSystems IRIS provides support for developing DocDB databases and applications in REST and in ObjectScript, as well as providing SQL support for creating or querying DocDB data.

By its nature, InterSystems IRIS Document Database is a schema-less data structure. That means that each document has its own structure, which may differ from other documents in the same database. This has several benefits when compared with SQL, which requires a pre-defined data structure.

The word “document” is used here as a specific industry-wide technical term, as a dynamic data storage structure. “Document”, as used in DocDB, should not be confused with a text document, or with documentation.

Let's explore how DocDB can help store JSON in the database and integrate it into projects that rely solely on xDBC protocols.

Announcement Anastasia Dyubaylo · Nov 22, 2024

Hi Developers,

🎄 Christmas cheer is in the air, and we decided to try something new for the last programming contest of the year. Welcome the 

🏆 Bringing Ideas to Reality Contest 🏆

Submit an application that implements an idea from the InterSystems Ideas Portal with the status Community Opportunity or Future Consideration and that requires doing the actual programming 😉

Duration: December 2 - 22, 2024

Prize pool: $14,000

 

Announcement Evgeny Shvarov · Dec 12, 2024

Here are the technology bonuses for the InterSystems "Bringing Ideas to Reality" Contest 2024 that will give you extra points in the voting:

  • IRIS Vector Search usage - 3
  • Embedded Python usage - 3
  • InterSystems Interoperability - 3
  • InterSystems IRIS BI - 3
  • VSCode Plugin - 3
  • FHIR Tools - 3
  • Docker container usage - 2
  • ZPM Package Deployment - 2
  • Online Demo - 2
  • Find a bug in Embedded Python - 2
  • Code Quality pass - 1
  • Article on Developer Community - 2
  • The second article on Developer Community - 1
  • Video on YouTube - 3
  • YouTube Short - 1
  • First Time Contribution - 3

See the details below.
