TDIing out loud, ok SDIing as well

Ramblings on the paradigm-shift that is TDI.

Thursday, November 7, 2013

Connector Loops and why you're gonna love them

I'm going to use an example where my AL needs to read User accounts and check whether each user is a manager. If so, it needs to search for the people who have the current user as their manager. For each person found, it then needs to look up HR information from another system.

Traditionally this would require an Iterator in the Feed section to do the initial reading of User accounts. The Data Flow would then have an IF-Branch to filter out managers. Under this Branch comes a Connector in Lookup mode to find the people who report to this manager. For each person returned by that search, another Lookup is required to retrieve info from the HR system. The traditional approach requires a good deal of Hook scripting - both to deal with the various lookup results (none found, and multiple found), and to handle the final search in the HR system.

Connector Loops make this much easier. Here is my AL to implement the logic described above:




Let's start with the innermost Connector Loop ('Get HR…'), which is used instead of a stand-alone Connector in Lookup mode. As a result, I don't have to script (or even enable) the On No Match and On Multiple Found Hooks. Using a Connector Loop in Lookup mode still gives me the option to code these Hooks if I choose, but even if I leave them disabled, the Loop will simply cycle once for each entry found. I may want to ensure that one and only one HR record is found for a managed person, but that is my decision, based on the quality of the data I'm reading and the requirements of my solution.
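
If I did decide to enforce that exactly one HR record exists, one option would be to enable those Hooks and simply log (or otherwise react to) the unexpected cases. A minimal sketch - the uid Attribute name here is just for illustration, not something your data necessarily has:

// On No Match Hook of the 'Get HR…' Loop
task.logmsg("No HR record found for " + work.getString("uid"));

// On Multiple Found Hook of the 'Get HR…' Loop
task.logmsg("More than one HR record found for " + work.getString("uid"));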

The second point I wanted to make is that I set both Connector Loops (the 'FOR-EACH' components in the screenshot above) to Iterator mode instead of Lookup. Both modes do the same job - searching for data based on some filter - but they do it in different ways. One major difference is that Lookup mode caches the entries read, up to the number you set in the More... > Lookup Limit setting, while Iterator mode leaves the result set on the server and retrieves one entry at a time for each Get Next operation. Another important difference is that Lookup mode uses Link Criteria to control its search filter, whereas Iterator mode does its selection based on parameter settings. For example, an LDAP Connector uses its Search Filter parameter, whereas a JDBC Connector will use the SQL Select parameter if it is set, and otherwise constructs a 'SELECT *' statement based on the Table Name parameter.

This last point leads me to another handy option for Connector Loops: the Connector Parameters tab.


This feature lets you use standard attribute mapping techniques to set the value of one or more parameters. In the above example I map the ldapSearchFilter parameter to control which entries are selected by the Loop.
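
As a sketch of what that parameter mapping could contain - assuming the directory's manager attribute holds the manager's DN and that the Work Entry carries the current user's DN in the $dn Attribute (both assumptions about my data, not requirements):

// Advanced mapping for the ldapSearchFilter parameter
ret.value = "(manager=" + work.getString("$dn") + ")";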

Fortunately, the Search Filter is refreshed from the parameter settings whenever entries are selected, as they are at the top of this Loop for each AL cycle. However, most parameters are only updated with the values in the Connection tab once, when the Connector initializes. If I needed to set a parameter that is not refreshed at selection time - for example the File Path of a File Connector, or the connection settings for a different database server - then I would need to configure the Loop to re-initialize as well as perform the selection.


The next thing I'd like to talk about is how you exit from a Loop, as well as how you continue to the next cycle of a Loop.

To leave a Loop you make a system.exitBranch("Loop") call in script, documented here: exitBranch. There are two variants of the exitBranch() call: one with no arguments that exits the current (innermost) Loop or Branch, and one with a String argument. If you use the predefined "Loop" argument then it exits the innermost Loop, skipping out of any Branches that might be closer in the AL tree. As the docs state, you can also exit a named Branch or Loop, as well as exiting the Data Flow section or even the entire AL.
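
For instance, a script inside a Loop could bail out early - a minimal sketch using a made-up status Attribute:

// Stop cycling this Loop as soon as an inactive entry shows up
if (work.getString("status") == "inactive")
   system.exitBranch("Loop");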

On the other hand, if you want to continue to the next cycle of a Loop then you use the system.continueLoop() call, described here: continueLoop. Again you have two variants: one with no arguments that continues the innermost Loop, and one that lets you name the Loop to continue to.
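
A similarly hedged sketch, again with an illustrative Attribute name:

// Skip entries without an employeeNumber and move on to the next cycle
if (work.getAttribute("employeeNumber") == null)
   system.continueLoop();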

Finally, note that the built-in looping performed by an AL - e.g. the Feed section feeding iterated data into the Data Flow section - automatically empties out the Work Entry at the top of each cycle. A Connector Loop does not do this for you. If you want to handle this yourself, then putting this snippet of code in the Before GetNext Hook of the Connector Loop (in Iterator mode) will do the trick:

work.removeAllAttributes();

You could use the same code in a Hook like Before Lookup for Lookup mode.

Furthermore, I often save off the work Entry prior to my Connector Loop, like this:

saveEntry = system.newEntry();
saveEntry.merge(work);

Then after the Loop completes, I can empty out the Work Entry once more and then merge back in the saveEntry Attributes.
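
In script, that restore step could look like this (reusing the saveEntry variable from the snippet above):

work.removeAllAttributes();
work.merge(saveEntry);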

And if you are still confused, please leave me comments and questions and I'll do my best to answer them.





Friday, November 1, 2013

My humble insights on TDI and source management

I've gotten this question a lot recently and so thought to share my thinking on the subject.

We here in the TDI team use RTC for shared projects. Due to technical difficulties getting TDI and RTC to run in the same Eclipse environment, I run them separately: TDI in my Windows image (Parallels) and RTC under OS X.

The first step is to set up a workspace for RTC, which I call, imaginatively enough, 'workspace_RTC'. Here I copy in the project that I created locally - at least for projects where I do the initial development.

From RTC I import the project that was copied to the workspace_RTC directory. Only the AssemblyLines and Resources folders (and contents) are tagged as included in source management. Furthermore, under Resource > Properties you will only want to include the actual Property Stores designed for this solution.

Once this is done you can either delete the local copy of the project, or rename it. Then import the project in the workspace_RTC folder via the Existing Projects into Workspace option. Select the project folder and DO NOT select the option to Copy projects into workspace. That way the CE project is linked to the one in RTC.

Alongside the Project folder in workspace_RTC we usually have a second top-level folder where support files are kept and managed separately - i.e. solution deliverables. For example, property files, external scripts and stuff like externalized attribute maps. This folder also holds the 'compiled' Project Config xml. As you probably already know, every time you make a change and then Save or Run/Debug an AL in the CE, the config XML file in the Runtime-<project name> folder under your TDI workspace is updated. That's why we do NOT include the Runtime assets in the main RTC project. Otherwise all team members will constantly be (partially) updating this resource.

Instead, one person is given the task of refreshing his local project and then creating the compiled config XML file, which is then placed in the secondary folder. If this is my job, then I use the Project > Properties setting in my CE to Link to this file. That way, whenever I deliver changes, I also update the complete config XML.

After this, my day-to-day life consists of first retrieving any changes from RTC and then refreshing the project in my CE. After that I work on those items that are my responsibility and then deliver these to RTC. If others will be working on the same source files, you'll need to lock these resources until you're done with them.

Of course, if you are using one of the source management systems that can be plugged into a single Eclipse instance alongside TDI, life is a bit easier. Then you can use the Team option in the context menu for resources directly from the CE.