# Mapping

0 Followers · 105 Posts

Mapping of classes, routines, and globals to namespaces.

Article Andreas Schneider · Apr 22, 2025 4m read

When using standard SQL or the object layer in InterSystems IRIS, metadata consistency is usually maintained through built-in validation and type enforcement. However, legacy systems that bypass these layers—directly accessing globals—can introduce subtle and serious inconsistencies.

2
0 150
Article Sanjib Pandey · Oct 17, 2025 13m read

Overview

This web interface is designed to facilitate the management of Data Lookup Tables via a user-friendly web page. It is particularly useful when your lookup table values are large, dynamic, and frequently changing. By granting end-users controlled access to this web interface (read, write, and delete permissions limited to this page), they can efficiently manage lookup table data according to their needs.

The data managed through this interface can be seamlessly utilized in HealthConnect rules or data transformations, eliminating the need for constant manual monitoring and management of the lookup tables and thereby saving significant time.

Note:
If the standard Data Lookup Table does not meet your mapping requirements, you can create a custom table and adapt this web interface along with its supporting class with minimal modifications. Sample class code is available upon request.

0
1 59
Article Andre Larsen Barbosa · Jul 30, 2025 3m read

Well... it's time for testing. We know that, all too often, development is already over by then. So, what now? Can I still improve the quality of my development?

The answer is: YES. Yes, you can. The Toolqa tool aims to do just that. It's a facilitator. What's its goal?

To ensure that APIs meet pre-established business requirements, while also protecting against <sarcasm>future failed attempts</sarcasm> to destroy your application, website, app, or anything else that uses your API.

Now you might be wondering, how does this happen? Where's the magic?

0
1 67
Article Chi Nguyen-Rettig · Jun 1, 2025 3m read

IRIS supports CCDA and FHIR transformations out-of-the-box, yet the ability to access and view those features requires considerable setup time and product knowledge. The IRIS Interop DevTools application was designed to bridge that gap, allowing implementers to immediately jump in and view the built-in transformation capabilities of the product. 

In addition to the IRIS XML, XPath, and CCDA Transformation environment, the Interop DevTools package now provides:

0
1 104
Question Bransen Smith · Oct 1, 2024

I have the class ConfigUtils.ConfigSettingsTable, which is a persistent object.  I know I need to map packages from the original namespace. In this case, I have mapped ConfigUtils.ConfigSettingsTable from the originating namespace (IRISTST database) across all other namespaces.

With this, I am able to see the table ConfigUtils.InstanceSettings in SQL Explorer in each namespace, but the same data is not shared across environments. For example, in the MAINTENANCE namespace, I can see the table, but I don't see the same information that I see in the table in the original IRISTST namespace.
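One thing worth checking (hedged, since the storage definition isn't shown in the question): a package mapping only maps the class definitions; the rows themselves live in the class's data and index globals, which need their own global mappings back to the source database. Assuming default storage global names for this class, the CPF fragment might look like:

```ini
[Map.MAINTENANCE]
Package_ConfigUtils=IRISTST
Global_ConfigUtils.ConfigSettingsTableD=IRISTST
Global_ConfigUtils.ConfigSettingsTableI=IRISTST
```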

8
0 200
Question Mark OReilly · Feb 4, 2025

When removing a segment in a DTL for HL7 using a foreach, the segment doesn't actually get removed and leaves a blank segment behind.

i.e. 

MSH|^~\&|SendingSystem|ReceivingSystem|202301241000||ADT^A01|12345|P|2.4
EVN|S|202301241000
PID|1|MRN12345|1^^^^KEEP~2^^^^Remove~3^^^^Keep|M|19800101|  
PD1|PatientAddress^Street^City^State^Zip
PV1|I|INPATIENT|BED123|DoctorID|202301241000|202301241000

It's this blank segment that needs to be removed.
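One workaround often suggested for this is to do the removal in a code action, walking the repetitions in reverse so that removing one doesn't shift the indices of those still to be visited. A hypothetical sketch (the field paths and the "remove" action code are assumptions to verify against your schema, and the target must be mutable, e.g. create='copy'):

```objectscript
 // Hypothetical sketch: remove PID-3 repetitions flagged "Remove",
 // iterating in reverse so removals don't shift the remaining indices.
 Set cnt = target.GetValueAt("PID:3(*)")          // repetition count
 For i = cnt:-1:1 {
   If target.GetValueAt("PID:3("_i_").5") = "Remove" {
     Do target.SetValueAt("", "PID:3("_i_")", "remove")
   }
 }
```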

2
0 169
Article Anthony Master · Dec 12, 2024 2m read

Like many others probably find themselves, we were stuck doing live data mapping in our Interface Engine that we really didn't want to do, but had no good alternative. We want to keep mappings only as long as they are needed and then purge expired rows based on a TTL value. We actually had 4 use cases for it ourselves before we built this. Use cases:

0
0 211
Article Anthony Master · Oct 17, 2024 2m read

I was working on a DTL but kept getting ERROR #5002... MAXSTRING errors. The problem was that most of the DTL GUI action steps only support the string data type when working with the segments. A %String has a limit of 3,641,144 characters, and in the example provided my OBX-5.1 was 5,242,952 characters long. Of course, the PACS admin stated that ultra-high-quality files, up to and including 4K resolution, were needed, so we could not get the vendor to compress or reformat these files to compressed JPG or something similar.

Initially this vendor sends a 2.3 ORU^R01 and our EHR (Epic) is expecting a 2.3 MDM^T02. Furthermore, we needed the following transformations:

  1. The embedded image was sent in OBX-5.1, and we needed it moved to OBX-5.5
  2. The image format was sent in OBX-6, and we needed it in OBX-5.3 and OBX-5.4
  3. We needed to create a TXA segment
  4. Support a set of 25 OBX segments that may be completely empty (25 x 5 MB = 125 MB+ message sizes, yikes!)

Example received message (replace ... with 5+ Mb embedded data):

MSH|^~\&|VENDOR||||20241017125335||ORU^R01|1|P|2.3|
PID|||203921||LAST^FIRST^^^||19720706|M||||||||||100001|
PV1||X||||||||GI6|||||||||100001|
ORC|RE||21||SC||1|||||||||||
OBR|1||21|21^VENDOR IMAGES|||20241017123056|||||||||1001^GASTROENTEROLOGY^PHYSICIAN|||||Y||||F||1|
OBX|1|PR|100001|ch1_image_001.bmp|...^^^^^^^|BMP|||||F|
OBX|2|PR|100001|ch1_image_003.bmp|...|BMP|||||F|
OBX|3|PR|100001|ch1_video_01thumbnail.bmp|...|BMP|||||F|
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||
OBX||||||||||||

My normal tools and testing were not up to par with these really large messages. When I used this example with the data replaced by the ..., the normal DTL drag-and-drop GUI and testing would play nice, but plug in the real data and it all crumbled.

Eventually I found that I had to use a code block with ObjectScript using the %GlobalCharacterStream data types to work with the large messages correctly.

Sharing my final DTL class for anyone who might come after me and find this helpful:

Class OrdRes.VendorMDM Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{

Parameter IGNOREMISSINGSOURCE = 1;

Parameter REPORTERRORS = 1;

Parameter TREATEMPTYREPEATINGFIELDASNULL = 0;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.3:ORU_R01' targetDocType='2.5:MDM_T02' create='new' language='objectscript' >
<assign value='source.{MSH}' property='target.{MSH}' action='set' />
<assign value='"MDM"' property='target.{MSH:MessageType.MessageCode}' action='set' />
<assign value='"T02"' property='target.{MSH:MessageType.TriggerEvent}' action='set' />
<assign value='"2.5"' property='target.{MSH:VersionID.VersionID}' action='set' />
<assign value='source.{MSH:DateTimeofMessage}' property='target.{EVN:2}' action='set' />
<assign value='source.{PIDgrpgrp().PIDgrp.PID}' property='target.{PID}' action='set' />
<assign value='source.{PIDgrpgrp().PIDgrp.PV1grp.PV1}' property='target.{PV1}' action='set' />
<assign value='source.{PIDgrpgrp().ORCgrp().ORC}' property='target.{ORCgrp().ORC}' action='set' />
<assign value='source.{PIDgrpgrp().ORCgrp().OBR}' property='target.{ORCgrp().OBR}' action='set' />
<assign value='source.{PIDgrpgrp().ORCgrp().NTE()}' property='target.{ORCgrp().NTE()}' action='set' />
<assign value='"Endoscopy Image"' property='target.{TXA:DocumentType}' action='set' />
<assign value='"AU"' property='target.{TXA:DocumentCompletionStatus}' action='set' />
<assign value='"AV"' property='target.{TXA:DocumentAvailabilityStatus}' action='set' />
<assign value='source.{PID:18}' property='target.{TXA:12.3}' action='set' />
<code>
<![CDATA[
 set OBXCount=source.GetValueAt("PIDgrpgrp(1).ORCgrp(1).OBXgrp(*)")
 For k1 = 1:1:OBXCount
 {
   // if OBX-1 is empty then it is assumed the rest of the segment will be empty too, so disregard it.
   If source.GetValueAt("PIDgrpgrp(1).ORCgrp(1).OBXgrp("_k1_").OBX:SetIDOBX") '= ""
   {
     // create new stream to read source OBX
     set srcOBXStream=##class(%GlobalCharacterStream).%New()
     // get stream data from source OBX
     set tSC=source.GetFieldStreamRaw(srcOBXStream,"PIDgrpgrp(1).ORCgrp(1).OBXgrp("_k1_").OBX")
      // get the positions of the needed delimiters:
     set p1=srcOBXStream.FindAt(1,"|")    // 0>p1="OBX"
     set p2=srcOBXStream.FindAt(p1+1,"|") // p1>p2=OBX-1
     set p3=srcOBXStream.FindAt(p2+1,"|") // p2>p3=OBX-2
     set p4=srcOBXStream.FindAt(p3+1,"|") // p3>p4=OBX-3
     set p5=srcOBXStream.FindAt(p4+1,"|") // p4>p5=OBX-4
     set p6=srcOBXStream.FindAt(p5+1,"^") // p5>p6=OBX-5.1
     set p7=srcOBXStream.FindAt(p6+1,"|") // p6>p7=OBX-5.2 -> OBX 5.*
     // if no OBX-5.2 then there will not be the `^` and p6 and p7 will be `-1`
     // when that is the case, find p7 starting at `p5+1` and make p6 = p7
     if (p6 < 0) {
       set p7=srcOBXStream.FindAt(p5+1,"|") // p5>p7=OBX-5
       set p6=p7
     }
     set p8=srcOBXStream.FindAt(p7+1,"|") // p7>p8=OBX-6
     set tStream=##class(%GlobalCharacterStream).%New()

      // write the segment name and renumber OBX-1 (Set ID) to k1
     set tSC=tStream.Write("OBX|"_k1_"|")
     
      // set OBX-2 to "ED"
     set tSC=tStream.Write("ED|")
     
     // copy source OBX-3 to target OBX-3
     set tSC=srcOBXStream.MoveTo(p3+1)
     set tSC=tStream.Write(srcOBXStream.Read(p4-p3-1))
     set tSC=tStream.Write("|")
     
     // copy source OBX-4 to target OBX-4
     set tSC=srcOBXStream.MoveTo(p4+1)
     set tSC=tStream.Write(srcOBXStream.Read(p5-p4-1))
     
     // copy source OBX-6 to OBX-5.3 & OBX-5.4
     set tSC=srcOBXStream.MoveTo(p7+1)
     set docType=srcOBXStream.Read(p8-p7-1)
     set tSC=tStream.Write("|^^"_docType_"^"_docType_"^")
     
     // copy source OBX-5.1 to target OBX-5.5
     // can only set up to 3,641,144 chars at once, so do while loop...
     set startPos=p5+1
     set remain=p6-p5-1
     // characters to read/write in each loop, max is 3,641,144 since .Write limit is a %String
     set charLimit=3000000
     while remain > 0 {
       set tSC=srcOBXStream.MoveTo(startPos)
       set toRead = charLimit
       if toRead > remain {
         set toRead=remain
       }
       set tSC=tStream.Write(srcOBXStream.Read(toRead))
       set remain=remain-toRead
       set startPos=startPos+toRead
     }
     set tSC=tStream.Write("|")

     set obxSegment=##class(EnsLib.HL7.Segment).%New()
     set obxSegment.SegType="2.5:OBX"
     set tSC=obxSegment.StoreRawDataStream(tStream)
     set tSC=target.setSegmentByPath(obxSegment,"OBXgrp("_k1_").OBX")
   }
 }
]]></code>
</transform>
}

}

For developing and testing this, I used the VS Code plugins for InterSystems, because the built-in testing tools could not handle the message size.

I will also add that getting HL7 over HTTPS to Epic's InterConnect also involved creating a custom HTTP class and sending the custom Content-Type x-application/hl7-v2+er7.
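For anyone facing the same hurdle, the custom Content-Type part can be sketched with %Net.HttpRequest. This is a hypothetical fragment, not the author's class: the server name, SSL configuration name, endpoint path, and pHL7RawText variable are all placeholders.

```objectscript
 // Hypothetical sketch of posting raw HL7 v2 over HTTPS with a custom Content-Type.
 Set req = ##class(%Net.HttpRequest).%New()
 Set req.Server = "interconnect.example.org"   // placeholder host
 Set req.Https = 1
 Set req.SSLConfiguration = "EpicSSL"          // assumed SSL/TLS config name
 Set req.ContentType = "x-application/hl7-v2+er7"
 Do req.EntityBody.Write(pHL7RawText)          // raw ER7 message text
 Set tSC = req.Post("/Interconnect/hl7")       // placeholder endpoint
```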

7
1 392
Question Ben Spead · May 29, 2024

Let's say I have an InterSystems IRIS instance with 6 Namespaces:

  • Foo1
  • Foo2
  • Foo3
  • Foo4
  • Foo5
  • Bar

And the number of Foo# namespaces can increase at any time for a number of reasons.  I need to ensure that they all have identical configuration globals stored in a DB called CONFIG, so I do the following in my configuration file:

[Map.%ALL] 
Global_SYS=%DEFAULTDB 
Global_SYS("CommonConfig")=CONFIG 
Global_SYS("CommonOtherSettings")=CONFIG 
Global_SourceControl=CONFIG 

This will ensure that all namespaces see the exact same settings when they reference ^SYS("CommonConfig") or ^SourceControl, etc.

Now...

9
0 328
Question Colin Nagle · Apr 11, 2024

Hello all,

If we have a global structured something like this:-

  1. ^test(1)="Dave^Engineer"
  2. ^test(2)="Bethan^Manager"
  3. ^test(2,"transferred")="2024-01-04"
  4. ^test(3)="James^Administrator"
  5. ^test(4)="Sophia^Marketing Executive"
  6. ^test(4,"transferred")="2023-11-20"

It is fairly straightforward to map, and from a reading perspective everything works fine, but when inserting/saving the "transferred" node is created even if the 'transferred' field is not set. Is there a way to populate the node only if the field mapped to it is set?

Thanks

Colin

2
0 175
Question Julian Matthews · Apr 8, 2024

Hi everyone.

Is there a sensible approach to having a lookup table in Namespace A, and then accessing this from Namespaces B, C, D (etc)?

I'm trying to avoid creating a global mapping of the lookup table global (^Ens.LookupTable), as I fear it would then link all other lookups in that global and lead to some unexpected behaviour, but I would be open to trying something in this realm if it's the best option.

Another approach I have considered is creating a custom lookup function, run from the secondary namespaces, that does some namespace hopping, but it feels messy. Something like:
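The excerpt cuts off before the author's snippet, but one hedged sketch of such a function (names are illustrative; an extended global reference avoids switching namespaces entirely) might be:

```objectscript
/// Hypothetical sketch: read a lookup value that lives in Namespace A.
ClassMethod CrossNSLookup(pTable As %String, pKey As %String) As %String
{
  // Extended reference into the other namespace's ^Ens.LookupTable global
  Quit $Get(^["NAMESPACEA"]Ens.LookupTable(pTable, pKey))
}
```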

4
0 348
Question Richard Rayburn · Mar 22, 2024

In Caché Studio there are features (dialog boxes) that help map data from a global to class properties.
I have used %CacheSQLStorage quite a bit, or have in the past, to map globals to classes.

I haven't been able to find a similar feature in Visual Studio.
Do I need to upgrade to IRIS to be able to use Visual Studio to map global properties to classes?

Thanks for your time,
Richard

2
0 183
Article Brendan Bannon · Dec 27, 2016 8m read

The Art of Mapping Globals to Classes (4 of 3)

The fourth in the trilogy; anyone a Hitchhiker's Guide to the Galaxy fan?

If you are looking to breathe new life into an old MUMPS application follow these steps to map your globals to classes and expose all that beautiful data to Objects and SQL.

If the above does not sound familiar to you please start at the beginning with the following:

The Art of Mapping Globals to Classes (1 of 3)

The Art of Mapping Globals to Classes (2 of 3)

The Art of Mapping Globals to Classes (3 of 3)

7
0 2281
Article Brendan Bannon · Dec 27, 2016 8m read

Mapping Examples

Clearly, if you have a fourth article in the trilogy, you need to go for the money grab and write the fifth, so here it is!

Note: Many years ago Dan Shusman told me that mapping globals is an art form. There is no right or wrong way of doing it. The way you interpret the data leads you to the type of mapping you do. As always, there is more than one way to get to a final answer. As you look through my samples you will see that some examples map the same type of data in different ways.

15
1 2713
Question Andy Khemraj · Dec 1, 2023

I am trying to replace the OBX-2 value of ED with PDF.

Here is what I have in the function wizard:

..ReplaceStr(source.{OBRgrp().OBXgrp().OBX():ValueType},"ED","PDF")

Inbound Message

OBX|5|ED|PDFReport^PDFReport||PDF^Image^PDF^Base64^JVBERi0xLjMNCjEgMCBvYmoNCjw8DQovVHlwZSAvQ2F0YWxvZw0KL1BhZ2VzIDQgMCBSDQovT3V0bGluZXMgMiAwIFI

Desired Outbound Message

OBX|5|PDF|PDFReport^PDFReport||PDF^Image^PDF^Base64^JVBERi0xLjMNCjEgMCBvYmoNCjw8DQovVHlwZSAvQ2F0YWxvZw0KL1BhZ2VzIDQgMCBSDQovT3V0bGluZXMgMiAwIFI
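..ReplaceStr only returns the modified value; it does nothing until its result is assigned back to the target field. A hypothetical sketch for a single repetition (the group indices are assumptions; wrap it in foreach loops to cover all repetitions):

```xml
<assign value='..ReplaceStr(source.{OBRgrp(1).OBXgrp(5).OBX:ValueType},"ED","PDF")'
        property='target.{OBRgrp(1).OBXgrp(5).OBX:ValueType}' action='set' />
```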

1
0 285
Question Sylvain Guilbaud · Sep 8, 2023

Hello,

I have a global whose structure is multi-level, and I am trying, through a class and an SQL query, to display a table which includes all the values and levels.

^AFO("Site","Ville")="66722,3743"
^AFO("Site","Ville","111BB","OBT")=",MMM,XXX,"
^AFO("Site","Ville","111OW","OBT")=",XXX,MMM,"
^AFO("Site","Ville","AANVRBIBS","zzz")    =    "1^^1"
^AFO("Site","Ville","AANVRBIBS","zzz","*","dut")    =    "*afhalen waar gevonden"
^AFO("Site","Ville","AANVRBIBS","zzz","*","eng")    =    "*Pickup where found"
^AFO("Site","Ville","AANVRBIBS","zzz","*","fre")    =    "*Lieu où trouvé"

2
0 588
Question John Bradshaw · Jul 12, 2023

I found the thread that discusses object mapping, in particular mapping a common global among more than one namespace. The example given is a simple one: ^global(sub1), ^global(sub2), etc. However, I'm having trouble getting this to compile/work when the global has a fixed subscript amongst variable ones.

I have this global in namespaces LAB and ARK in the following format:
^CB(1,sub1)=....

^CB(1,sub2)=...

^CB(1,sub3)=...

Here is what I have for this. In its current state it throws tons of errors:
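For reference, the usual way to pin a constant subscript in a %CacheSQLStorage map is a literal <Expression>. A hypothetical fragment (property and data names are placeholders, and this is not a complete storage definition) for ^CB(1,sub1) might look like:

```xml
<SQLMap name="Map1">
<Global>^CB</Global>
<Subscript name="1">
<Expression>1</Expression>
</Subscript>
<Subscript name="2">
<Expression>{Sub1}</Expression>
</Subscript>
<Data name="Value">
<Piece>1</Piece>
</Data>
<Type>data</Type>
</SQLMap>
```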

4
0 400
Question Javier Gonzalez · Jun 1, 2017

I'm building a REST service. One method has, as its body parameter, a JSON object corresponding to a class A.

In my production I have class A, so I retrieve the parameters using a dynamic object, like this:

Set body = ##class(%DynamicObject).%FromJSON(%request.Content)
Set myObjectA = ##class(A).%New()
Set myObjectA.Id = body.Id
Set myObjectA.Name = body.Name
Set myObjectA.Date = body.Date
Set myObjectA.Salary = body.Salary

I would like to know if I can avoid doing the manual mapping by doing a cast, since I am sure that %FromJSON will return a class A. Something like this:
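There isn't a direct cast from %DynamicObject, but (on IRIS) the field-by-field copying can be avoided if class A extends %JSON.Adaptor. A hedged sketch, not the original poster's code:

```objectscript
 // Hypothetical sketch, assuming: Class A Extends (%RegisteredObject, %JSON.Adaptor)
 Set myObjectA = ##class(A).%New()
 Set tSC = myObjectA.%JSONImport(%request.Content)
 If $$$ISERR(tSC) {
   // handle an invalid payload here
 }
```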

9
1 1019
Question Ties Voskamp · Aug 3, 2023

Hi,

Using Interoperability, I can't figure out how to create separate XML files from a CSV file using the GUI features Record Maps/Complex Record Mapper -> Data Transformations. I'm familiar with reading/writing the files using a File Service/Operation, but don't understand the processing steps. The method preferred by my colleagues is to do this without any ObjectScript or Embedded Python coding, but if it can only be done with some coding, that's fine as well.

See example below. Any help is appreciated!

Kind regards, Ties Voskamp

Example CSV:

PONumber;PODate;ArticleID;ArticleName;MeasureOfUnit;UnitPrice;UnitsOrdered
52001;2023-02-01;"AP12345";"Pencil Black 5mm";10;5.50;1
52001;2023-02-01;"AP12350";"Pencil Red 15mm";10;7.50;2
52003;2023-03-01;"AP12345";"Pencil Black 5mm";10;5.50;5
52003;2023-03-01;"AP12350";"Pencil Red 15mm";10;7.50;5
52003;2023-03-01;"AP12346";"Pencil Blue 5mm";10;5.50;3
52003;2023-03-01;"AP12347";"Pencil Green 5mm";10;5.50;2
52014;2023-04-21;"AP12345";"Pencil Black 5mm";10;5.50;5
52014;2023-04-21;"AP12350";"Pencil Red 15mm";10;7.50;5
52014;2023-04-21;"AP12346";"Pencil Blue 5mm";10;5.50;8

I need to create separate XML-files (one per unique PONumber) with the orderlines belonging to that PONumber. Example output: PO52001.xml

<PurchaseOrder PONumber="52001" PODate="2023-02-01">
    <POLine>
        <ArticleID>AP12345</ArticleID>
        <ArticleName>Pencil Black 5mm</ArticleName>
        <MeasureOfUnit>10</MeasureOfUnit>
        <UnitPrice>5.50</UnitPrice>
        <UnitsOrdered>1</UnitsOrdered>
    </POLine>
    <POLine>
        <ArticleID>AP12350</ArticleID>
        <ArticleName>Pencil Red 15mm</ArticleName>
        <MeasureOfUnit>10</MeasureOfUnit>
        <UnitPrice>7.50</UnitPrice>
        <UnitsOrdered>2</UnitsOrdered>
    </POLine>
</PurchaseOrder>
2
0 301
Article Brendan Bannon · Aug 29, 2016 7m read

The Art of Mapping Globals to Classes 1 of 3

Looking to breathe new life into an old MUMPS application?  Follow these steps to map your existing globals to classes and expose all that beautiful data to Objects and SQL.

By following the simple steps in this article and the next two, you will be able to map all but the craziest globals to Caché classes. For the crazy ones I will put up a zip file of different mappings I have collected over the years. This is NOT for new data; if you don't already have an existing global, please just use the default storage.

26
12 6089
Question Dmitrij Vladimirov · Jan 10, 2023

I need to split existing tables from a database and put some parts of them into a new namespace. I don't know where to start, other than the installer.cls file. If you can provide clear instructions, I would be grateful.
Example:
I have NAMESPACE=NEWTEST and a DB.

Then I need to take tables from that DB, pull specific data from them, and bind it to NEWTEST.

5
0 355
Article Eduard Lebedyuk · Feb 19, 2016 12m read

Suppose you have developed your own app with the InterSystems technology stack and now want to perform multiple deployments on customers' sites. During the development process you've composed a detailed installation guide for your application, because you need to not only import classes but also fine-tune the environment according to your needs.
To address this specific task, InterSystems has created a special tool called %Installer. Read on to find out how to use it.

9
4 4601
Article Lorenzo Scalese · Apr 4, 2022 6m read

Hi Community,

This article describes the small ZPM module global-archiver.
The goal is to move part of a global from one database to another.

Why this package?

A typical use case is read-only data sequentially added to your database that you can never delete.
For example:

  • User log access to patient medical data.
  • Medical documents versioning.

Depending on how intensively your application is used, this data can significantly increase your database size.
To reduce backup time, it can be worthwhile to move this data to a dedicated archive database and back up that database only after an archive run.

How Does It Work?

global-archiver copies a global using the Merge command and database-level global references.
The process needs only the global name, the last identifier, and the archive database name.
The archive database can be a remote database; the syntax ^["/db/dir/","server"]GlobalName is also supported.
After the copy, a global mapping is automatically set up with the related subscript.
See the copy method here
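The Merge idiom the package wraps can be sketched in a few lines (the directory and global names mirror the article's example and are placeholders):

```objectscript
 // Hypothetical sketch of the underlying copy: Merge into an extended
 // global reference that points at the archive database's directory.
 Set src = $Name(^lscalese.globalarcCA13.DataLogD)
 Set dst = $Name(^["/db/archive/"]lscalese.globalarcCA13.DataLogD)
 Merge @dst = @src
```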

Let's do a simple test

ZPM Installation

zpm "install global-archiver"

Generate sample data:

Do ##class(lscalese.globalarchiver.sample.DataLog).GenerateData(10000)

Now we copy the data with a DateTime older than 30 days:

Set lastId = ##class(lscalese.globalarchiver.sample.DataLog).GetLastId(30)
Set Global = $Name(^lscalese.globalarcCA13.DataLogD)
Set sc = ##class(lscalese.globalarchiver.Copier).Copy(Global, lastId, "ARCHIVE")

Our data is copied, and we can delete the archived data from the source database.
On a live system, it's strongly recommended to take a backup before cleaning.

Set sc = ##class(lscalese.globalarchiver.Cleaner).DeleteArchivedData(Global,"ARCHIVE")

Now, open the global mapping page in the Management Portal. You should see a mapping with a subscript for the global lscalese.globalarcCA13.DataLogD.
You can see the data using the global explorer in the Management Portal: select a database instead of a namespace, and show the global lscalese.globalarcCA13.DataLogD in the databases USER and ARCHIVE.

Great, the data has been moved, and thanks to global mapping no changes to your application code are needed.

Advanced usage - Mirroring and ECP

The previous test was very simple. In the real world, we most often work with mirroring for high availability, and of course global mapping can then be a problem.
When we perform an archive process, the global mapping is only set up on the primary instance.
To solve this problem, a helper method NotifyBecomePrimary is available and must be called from the ZMIRROR routine.
ZMIRROR documentation link.

A sample that lets you start a container architecture with failover members is available on this GitHub page.

schema-01

This sample also lets you test copying to a remote database; as you can see in the schema, our archive database will be a remote database (using ECP).

git clone https://github.com/lscalese/global-archiver-sample.git
cd global-archiver-sample

Copy your "iris.key" in this directory

This sample uses a local directory as a volume to share the database file IRIS.DAT between containers. We need to set security settings on the ./backup directory.

sudo chgrp irisowner ./backup

If the irisowner group and user do not exist yet on your system, create them.

sudo useradd --uid 51773 --user-group irisowner
sudo groupmod --gid 51773 irisowner

Log in to the InterSystems Container Registry

Our docker-compose.yml references containers.intersystems.com, so you need to log in to the InterSystems Container Registry (ICR) to pull the images it uses.
If you don't remember your password for the docker login to ICR, open this page https://login.intersystems.com/login/SSO.UI.User.ApplicationTokens.cls and you can retrieve your docker token.

docker login -u="YourWRCLogin" -p="YourPassWord" containers.intersystems.com

Generate certificates

Mirroring and ECP communications are secured by TLS/SSL, so we need to generate certificates for each node.
Simply use the provided script:

# sudo is required due to usage of chown, chgrp, and chmod commands in this script.
sudo ./gen-certificates.sh

Build and run containers

#use sudo to avoid permission issue
sudo docker-compose build --no-cache
docker-compose up

Just wait a moment; a post-start script is executed on each node (it takes a while).

Test

Open an IRIS terminal on the primary node:

docker exec -it mirror-demo-master irissession iris

Generate data:

Do ##class(lscalese.globalarchiver.sample.DataLog).GenerateData(10000)

If you get a "class does not exist" error, don't worry: the post-start script may not have installed global-archiver yet. Just wait a bit and retry.

Copy data older than 30 days to the ARCHIVE database:

Set lastId = ##class(lscalese.globalarchiver.sample.DataLog).GetLastId(30)
Set Global = $Name(^lscalese.globalarcCA13.DataLogD)
Set sc = ##class(lscalese.globalarchiver.Copier).Copy(Global, lastId, "ARCHIVE")

Delete archived data from the source database:

Set sc = ##class(lscalese.globalarchiver.Cleaner).DeleteArchivedData(Global,"ARCHIVE")

Check the global mapping

Great, now open the Management Portal on the current primary node, check the global mapping for the namespace USER, and compare it with the global mapping on the backup node.
You should notice that the global mapping related to the archived data is missing on the backup node.

Stop the current primary to force the mirror-demo-backup node to become the primary.

docker stop mirror-demo-master

When mirror-demo-backup changes status to primary, the ZMIRROR routine runs and the global mapping for the archived data is automatically set up.

Check data access

Simply execute the following SQL query:

SELECT 
ID, AccessToPatientId, DateTime, Username
FROM lscalese_globalarchiver_sample.DataLog

Verify the number of records; it must be 10,000:

SELECT 
count(*)
FROM lscalese_globalarchiver_sample.DataLog

Links - Access to the portals

Master: http://localhost:81/csp/sys/utilhome.csp
Failover backup member: http://localhost:82/csp/sys/utilhome.csp
ARCHIVE ECP Data server: http://localhost:83/csp/sys/utilhome.csp

5
0 350
Question Chip Gore · Aug 18, 2016

I'm VERY novice on all things "OpenAM", and beyond knowing that Caché supports working with OpenAM, I have nothing else to go on.

The documentation doesn't seem to be very deep on the nature of how this works beyond a single paragraph saying it's supported for Single Sign On (SSO).

For Caché to use this, I gather that there is an environment variable (REMOTE_USER) which is set to "something", but it's not clear to me how this ends up mapping to a provisioned Caché user (or LDAP-provisioned user, for that matter) and ultimately to the %Roles in effect and subsequent system access.

1
0 427