Yet Another Tridion Blog

Unit Testing your TBBs

One of the reasons I was Messing with the Engine is testing. I wanted to write unit tests for Tridion Template Building Blocks, and for that I needed a way to run my templates outside of the Content Manager.

Approach

The approach is to run templates from an external project (be it a stand-alone application or a testing framework of your choice) and be able to check the Package contents at different stages of the execution. Then I simply check that certain Package items, or even the Output item, contain the expected results.

In order to run templates in Tridion CM, one needs an instance of the Engine and Package objects. The Package is quite simple, as you can just instantiate one using the Package(Engine engine) constructor. Getting an instance of Engine, on the other hand, was quite excruciating -- somebody in Tridion R&D put great effort into making that class final, sealed, internal, unextendable, unwrappable, private constructors, <fill in here your preferred C# access modifier>, etc. Well, too bad for all that effort, because one way or another there is always a solution -- in my case, enter reflection... but more on that later.

Usage

I came up with my own engine implementation, TestEngine, and I am able to run it on a Page or Component while passing the Page Template, or Component Template respectively, to render with:

TestEngine engine = new TestEngine("tcm:20-102-64", "tcm:20-707-128");
engine.Run();

In the code above, a Page is rendered with a PT. For that, TestEngine creates its own Package and then, by calling Run(), it simply fires off the template rendering process. Once Run() finishes, all TBBs in the template have executed and the Package can be inspected to determine the success/failure status of the unit test.

foreach (KeyValuePair<string, Item> pair in engine.Package.GetEntries()) {
    Item item = pair.Value;
    Console.WriteLine(string.Format("Item {0} | Type {1} | Content {2}",
            pair.Key, item.ContentType, item.GetAsString()));
}

With the current implementation the whole template is executed, but a more fine-grained approach is also possible, where only a specified TBB is executed, as sketched below.
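For example, a single TBB could be executed directly against the TestEngine's Package. This is only a minimal sketch -- MyLinkResolverTbb is a made-up class name, and it assumes the TBB does not depend on items pushed by earlier TBBs:

TestEngine engine = new TestEngine("tcm:20-102-64", "tcm:20-707-128");

// MyLinkResolverTbb is purely illustrative -- any .NET assembly TBB implementing
// Tridion.ContentManager.Templating.Assembly.ITemplate would be invoked the same way.
ITemplate tbb = new MyLinkResolverTbb();
tbb.Transform(engine, engine.Package);

// Only the items pushed by this single TBB end up in the Package,
// which keeps the assertions focused on the unit under test.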

TestEngine Implementation

Let's look at the implementation of TestEngine. It is a TemplatingRenderer specialization:

public class TestEngine : TemplatingRenderer

Constructor

TemplatingRenderer contains a bunch of useful logic, which I simply wanted to re-use. But before being able to do so, the Engine has to be initialized, so the Package and RenderedItem members need to be assigned. Again, with Package there is no problem, but _renderedItem is not exposed. Here comes the first hack -- set _renderedItem using reflection. I do this in the TestEngine constructor:

public TestEngine(string pageOrComponentTcmUri, string templateTcmUri) {
    _session = new Session();

    itemToRender = _session.GetObject(pageOrComponentTcmUri);
    template = _session.GetObject(templateTcmUri) as Template;

    typeof(TemplatingRenderer).GetField("_renderedItem", BindingFlags.Instance | BindingFlags.NonPublic)
        .SetValue(this, new RenderedItem(
            new ResolvedItem(itemToRender, template),
            new RenderInstruction(_session) { RenderMode = RenderMode.PreviewDynamic }
        )
    );
}

Run

Now that I have an instance of the Engine, let's execute the template on a Page or Component. There is one public method, Render, which kicks off the entire execution, but it does too much for me -- I need something more fine-grained, where I have access to the Package. That method is Engine.TransformPackage(Template, Package), which only deals with the rendering part of the process. The problem with it is its visibility -- internal. Here comes hack #2, and again reflection comes to the rescue:

public void Run() {
    typeof(Engine).GetMethod("TransformPackage", BindingFlags.Instance | BindingFlags.NonPublic)
        .Invoke(this, new object[] { Template, Package });
}

Notice the property Package that I'm passing to TransformPackage. This is the object I create and the one that I inspect at the end of the template rendering.

All Together

Putting it all together, this is the complete class:

public class TestEngine : TemplatingRenderer {

    private IdentifiableObject itemToRender;
    public IdentifiableObject ItemToRender {
        get { return itemToRender; }
    }

    private Template template;
    public Template Template {
        get { return template; }
    }

    private Package package;
    public Package Package {
        get {
            if (package == null) {
                package = new Package(this);
                if (itemToRender.Id.ItemType == ItemType.Component) {
                    Item item = package.CreateTridionItem(ContentType.Component, itemToRender);
                    package.PushItem(Tridion.ContentManager.Templating.Package.ComponentName, item);
                } else {
                    Item item = package.CreateTridionItem(ContentType.Page, itemToRender);
                    package.PushItem(Tridion.ContentManager.Templating.Package.PageName, item);
                }
            }

            return package;
        }
    }

    public TestEngine(string pageOrComponentTcmUri, string templateTcmUri) {
        _session = new Session();

        itemToRender = _session.GetObject(pageOrComponentTcmUri);
        template = _session.GetObject(templateTcmUri) as Template;

        typeof(TemplatingRenderer).GetField("_renderedItem", BindingFlags.Instance | BindingFlags.NonPublic)
            .SetValue(this, new RenderedItem(
                new ResolvedItem(itemToRender, template),
                new RenderInstruction(_session) { RenderMode = RenderMode.PreviewDynamic }
            )
        );
    }

    public void Run() {
        typeof(Engine).GetMethod("TransformPackage", BindingFlags.Instance | BindingFlags.NonPublic)
            .Invoke(this, new object[] { Template, Package });
    }
}
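To give an idea of how this reads as an actual unit test, here is a minimal sketch built on TestEngine. It assumes NUnit, assumes the template's final TBB pushes an Output item, and uses placeholder TcmUris and an arbitrary expected string:

[TestFixture]
public class PageTemplateTests {

    [Test]
    public void PageTemplate_Produces_Expected_Output() {
        // Placeholder TcmUris -- replace with a Page and Page Template from your CM.
        TestEngine engine = new TestEngine("tcm:20-102-64", "tcm:20-707-128");
        engine.Run();

        // Inspect the Package once all TBBs have executed.
        Item output = engine.Package.GetByName(Package.OutputName);

        Assert.IsNotNull(output, "The template did not produce an Output item");
        StringAssert.Contains("<title>", output.GetAsString());
    }
}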



TOM.Java - Now with Generics Support

What kept me busy lately (apart from my insane workload)? Adding 'generics' support to the TOM.Java API.

I was blogging a few weeks ago about a pet project of mine -- the Java Mediator for SDL Tridion Templating. In that post I was very enthusiastic about the JNI4NET bridge between Java and .NET, and namely about ProxyGen -- the tool that actually generates both the Java and C# proxy classes. For what it is, ProxyGen is a great tool, but it's really a Java 1.4 tool -- namely, it does not support generics. Luckily, it is open source, so anybody can modify it...

However, the problem for me was my own learning curve. ProxyGen is in fact a code generator written in C# using CodeDom, which builds a class model (from JARs and DLLs) using reflection, and finally generates C# and Java code from this model.

I won't go into the ProxyGen details here (if you are interested, I posted more details on the jni4net Google group).

Anyway, with this (partial) support for generics in TOM.Java, I am able to use Java counterparts for the main generic collections from C# (and the other way around, for that matter -- from Java collections to C# collections). The following type mappings are available:

Java                        C#
java.util.List<E>           System.Collections.Generic.IList<T>
java.util.Map<K, V>         System.Collections.Generic.IDictionary<K, V>
java.util.Set<E>            System.Collections.Generic.ISet<T>
java.util.Collection<E>     System.Collections.Generic.ICollection<T>
java.lang.Iterable<E>       System.Collections.Generic.IEnumerable<T>

The methods using generics delegate their calls to the non-generic native counterparts (as originally designed by ProxyGen). What I did was basically handle the conversion from generic to non-generic and back. The delegate methods are suffixed with the name of the Java collection they are a generic counterpart for.

Example: Package.GetEntries() returns a C# IList of KeyValuePairs of string and Item objects.
Original signature:

IList<KeyValuePair<string, Item>> GetEntries()

ProxyGen generates a Java native proxy method with signature:

public native system.collections.IList GetEntries()

If I were to use this method I would have to use the IEnumerator iterator like this (and that does not support Java for-each loops):

for (IEnumerator enumerator = _package.GetEntries().GetEnumerator(); enumerator.MoveNext();) {
    @SuppressWarnings("rawtypes")
    KeyValuePair kvp = (KeyValuePair) enumerator.getCurrent();
    system.String key = (system.String) kvp.getKey();
    Item value = (Item) kvp.getValue();
    System.out.println(String.format("\tKey: %s\tValue: %s", key, value.GetAsString()));
}

In addition, my enhancement to ProxyGen now generates a delegate method that supports generics:

java.util.List<mitza.jni4net.wrapper.KeyValuePair<system.String, tridion.contentmanager.templating.Item>> GetEntriesList()

So the loop above can be rewritten Java-style with for-each:

for (KeyValuePair<system.String, Item> kvp : _package.GetEntriesList()) {
    System.out.println(
        String.format("Key: %s Value: %s", kvp.getKey(), kvp.getValue().GetAsString()));
}

Other examples:

for (ComponentPresentation cp : page.getComponentPresentationsList()) {
    Component component = cp.getComponent();
    ComponentTemplate componentTemplate = cp.getComponentTemplate();
    System.out.println(String.format("CP: Component: '%s' %s + Component Template: '%s' %s",
        component.getTitle(), component.getId(), componentTemplate.getTitle(), componentTemplate.getId()));
}

for (VersionedItem item : page.GetVersionsIterable()) {
    System.out.println(String.format("Version: %s User: %s Date: %s",
        item.getVersion(), item.getRevisor().getTitle(), item.getRevisionDate()));
}


Tridion OData - What's Possible and What Not?

I think I should make a correction to the title: it's not really OData -- it's the SDL Tridion Content Delivery Webservice, which supports the OData protocol.

There is a bit of confusion about what can and cannot be done with it, so the intention of this post is to (try to) clarify its capabilities somewhat.

This post assumes you have OData installed and a client configured to read the API (if you don't, you can start by reading the material on SDL Tridion World). I am using a .NET client, as described in the previous link.

So what can be done with the CD Webservice (aka OData)? Well, pretty much everything you can do with the CD API. However, not everything. For example, you cannot really do dynamic 'broker' queries. I'll explain some details below.

Let's start with the basics... In my client code, I set up the Content Delivery Webservice object using the following construct (again, I refer back to the SDL Tridion World article for creating the proxy classes for this client):

string CDWS_URL = "http://localhost:8080/odata.svc";
var service = new ContentDeliveryService(new Uri(CDWS_URL));

Accessing Items by ID

The following example reads Component Presentations for a given Component TcmUri, retains the first CP and then returns its content. Notice how easy it looks using Linq.

private string GetCPContentByUri(ContentDeliveryService service, string uri) {
    string result = string.Empty;
    TcmUri tcmUri = new TcmUri(uri);

    var cp = (from x in service.ComponentPresentations
              where x.PublicationId == tcmUri.PublicationId &&
                    x.ComponentId == tcmUri.ItemId
              select x).FirstOrDefault();

    if (cp != null) {
        result = cp.PresentationContent;
    }

    return result;
}

Behind the scenes, when calling GetCPContentByUri(service, "tcm:1-36"), the proxy calls the following URL:

/odata.svc/ComponentPresentations()?$filter=(PublicationId%20eq%201)%20and%20(ComponentId%20eq%2036)&$top=1

Accessing Items by Property

The sample below retrieves a Page by its 'Url' property, then expands the linked entity 'PageContent' and finally returns its Content string. Without expanding it, page.PageContent would be null.

private string GetPageContentByUrl(ContentDeliveryService service, string url) {
    string result = string.Empty;

    var page = (from x in service.Pages.Expand("PageContent")
                where x.Url == url
                select x).FirstOrDefault();

    if (page != null) {
        result = page.PageContent.Content;
    }

    return result;
}

Since PageContent is a related entity, I need to expand on this property of Page in order to read the related entity in the same HTTP request. Otherwise, I would have to issue more HTTP GETs and performance would not be optimal.

When executing the call GetPageContentByUrl(service, "/CwaRefImpl/system/style.css"), the following URL is called on the Webservice:

/odata.svc/Pages()?$filter=Url%20eq%20'%2FCwaRefImpl%2Fsystem%2Fstyle.css'&$top=1&$expand=PageContent

Limitation 1 - Querying by Related Item Property

It is only possible to query items by their *own* properties. Attempting to query by another item's property (even if linked to the current items collection) will result in a run-time error: System.Data.Services.Client.DataServiceClientException "can not execute query, check filter expression".

I can only speculate about what happens here, but this kind of query is supposed to be supported by the OData protocol itself (in fact, the Northwind samples from Microsoft do have examples of such queries). Therefore, this is clearly a limitation of Tridion's CD Webservice implementation of OData.

In the code below, I am retrieving items from the ComponentPresentations collection, but querying on the linked entity Component's property Title. What I should do instead is query on properties of ComponentPresentation itself.

private string GetComponentPresentationByComponentTitle(ContentDeliveryService service, string componentTitle) {
    string result = string.Empty;

    // Limitation - this won't work... :(
    // cannot query by properties in referenced entities
    var cp = (from x in service.ComponentPresentations
              where x.Component.Title == componentTitle
              select x).FirstOrDefault();

    if (cp != null) {
        result = cp.PresentationContent;
    }

    return result;
}

In order to still have the functionality of retrieving CPs by Component title, I need to rewrite the query. I retrieve Components, filter by their Title, expand the ComponentPresentations linked entity and return its PresentationContent property. A bit cumbersome, but it works.

private string GetComponentPresentationByComponentTitleFIX(ContentDeliveryService service, string componentTitle) {
    string result = string.Empty;

    var component = (from x in service.Components.Expand("ComponentPresentations")
                     where x.Title == componentTitle
                     select x).FirstOrDefault();

    if (component != null) {
        result = component.ComponentPresentations[0].PresentationContent; // check bounds
    }

    return result;
}

Limitation 2 - Query on Multiple Custom Meta

That's right, you can only query on one Custom Meta key/value pair at a time (KeyName="State" and StringValue="California"), which makes perfect sense. You cannot combine more than one pair in the same query (KeyName="State" and StringValue="California" and KeyName="Type" and StringValue="Article"), because you are in fact querying the same DB table (CustomMeta), where KeyName and StringValue are columns. Therefore, there will never be a record that holds different values for the same column (e.g. KeyName) at the same time.

The following query will not fail, but it will return an empty (default) object -- in other words, null.

private string GetComponentByCustomMeta(ContentDeliveryService service, string key1, string value1, string key2, string value2) {
    string result = string.Empty;

    var customMeta = (from x in service.CustomMetas.Expand("Component")
                      where x.KeyName == key1 && x.StringValue == value1 &&
                            x.KeyName == key2 && x.StringValue == value2
                      select x).FirstOrDefault();

    if (customMeta != null) {
        result = customMeta.Component.Title;
    }

    return result;
}

The query above should in fact be on service.Components, filtering on something like Component.CustomMeta.KeyName, but that's not supported (see the first limitation above).
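A possible client-side workaround (not something the Webservice offers out of the box) is to issue one request per key/value pair and intersect the results in memory. The sketch below assumes the generated CustomMeta proxy exposes a numeric ItemId property -- check the entity properties in your own proxy classes:

private string GetComponentByTwoCustomMetas(ContentDeliveryService service,
        string key1, string value1, string key2, string value2) {
    string result = string.Empty;

    // First request: filter on the first key/value pair and expand the Component.
    var first = (from x in service.CustomMetas.Expand("Component")
                 where x.KeyName == key1 && x.StringValue == value1
                 select x).ToList();

    // Second request: filter on the second pair; only the item ids are needed.
    var secondIds = new HashSet<int>(
        (from x in service.CustomMetas
         where x.KeyName == key2 && x.StringValue == value2
         select x).ToList().Select(x => x.ItemId));

    // Intersect client-side: keep the first match that also satisfies the second pair.
    var match = first.FirstOrDefault(x => secondIds.Contains(x.ItemId));
    if (match != null) {
        result = match.Component.Title;
    }

    return result;
}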


Tridion Beanification - Expression Language Evaluator for JSP/JSTL Templates

The "beanification" I am presenting here -- which could in fact be called the "DGX's Java steroided big-brother" -- is the functionality that enables writing EL for the JSP/JSTL Tridion templates, part of the Java Mediator I am currently working on. I knew writing this piece would be fun, but I underestimated how much fun it would really be. I had a blast! :) <geek sneers here/>

EL simplifies writing rather complex expressions by allowing them to be written using just the . and [ ] syntax. The dot (.) is used to specify 'Java Beans'-like properties on a given object, and the square brackets ([ ]) specify an indexed element on any property that returns a collection.

Example:
    ${Page.ComponentPresentations[1]}
looks up the Page object, then accesses the ComponentPresentations list property on it and finally returns the 2nd Component Presentation as denoted by [1].

The Tridion beanification, in fact an EL evaluator, conforms closely to the Java Beans convention, with minor modifications to suit the JNI proxy methods generated from .NET into Java. If we wanted to evaluate ${base.propertyName}, the default Java Beans standard would look up, using reflection, the methods getPropertyName() and isPropertyName() on the base object. Additionally, the Tridion EL evaluator will also look up the GetPropertyName(), IsPropertyName() and PropertyName() methods on the base object, as generated from .NET methods or properties.

Special handling is done for items in the Package and Engine -- expressions starting with Package or Engine will consider them as the base object in the EL.

Example:
    ${Engine.RenderMode} returns the Render Mode as string by invoking engine.getRenderMode()
    ${Package.Page.PublishLocationUrl} returns the URL where the page is published by invoking engine.GetObject(package.GetByName("Page")).GetPublishLocationUrl()

For all other expressions, the evaluator will try to first look it up in the Package and if found, try to retrieve its actual TOM.Java object (from the Engine), so expressions ${Engine.Page} and ${Page} are in fact the same, as both yield the same Page TOM.Java proxy object into .NET.

Some Tridion EL examples:
    ${Page.ContextRepository.Title}
    ${Page.Publication.Title}
    ${Page.Publication.Metadata.Link.Metadata.Type.Title}
from Page get Publication (ContextRepository), get Metadata Schema, field Link (Component Link), get actual linked Component, get Metadata field Type (a Category Keyword) and finally get the Keyword title;
    ${Page.Publication.Metadata.Link.Fields.Summary} linked Component, field Summary
    ${Page.Publication.Metadata.Embeddable[1].Number} Embeddable field, numeric field Number
    ${Page.Publication.Metadata.Embeddable[1].MMLink.Title} Embeddable Multimedia Link, get title
    ${Page.Publication.Metadata.Link.Title} Component Link title
 
I can even write loops now:
    <c:forEach var="cp" items="${Page.ComponentPresentationsList}">
        Component: ${cp.Component.Id} '${cp.Component.Title}'
        Component Template: ${cp.ComponentTemplate.Id} '${cp.ComponentTemplate.Title}'
    </c:forEach>
 
Notice the use of ComponentPresentationsList instead of ComponentPresentations -- this is because the List version returns a java.util.List wrapper, so it can be used in a for-each iteration.

Implementation Details

I implemented the whole logic in a javax.el.ELResolver subclass, which gets wired up in the Page Context's getELContext method.

public class TridionELResolver extends ELResolver {

    private static ELResolver instance;
    private static final String methodPrefixes[] = new String[] { "get", "Get", "is", "Is", "" };

    @Override
    public Object getValue(ELContext context, Object base, Object property) throws NullPointerException,
            PropertyNotFoundException, ELException {

        Object result = null;
        if (base == null) {
            result = getTopLevelProperty(context, property);
            if (result != null && !(result instanceof Item)) {
                context.setPropertyResolved(true);
            }
        } else {
            result = getBaseProperty(context, base, property);
            if (result == null) {
                String stringProperty = property.toString();
                String friendlyProperty = fixFriendlyProperty(base, stringProperty);
                if (!friendlyProperty.equals(stringProperty)) {
                    result = getBaseProperty(context, base, friendlyProperty);
                }
            }
            if (result != null) {
                context.setPropertyResolved(true);
            }
        }
        return result;
    }

    public static ELResolver getInstance() {
        if (instance == null) {
            instance = new TridionELResolver();
        }
        return instance;
    }

    private Object getTopLevelProperty(ELContext context, Object property) {
        FakePageContext pageContext = (FakePageContext) context.getContext(JspContext.class);
        Package _package = pageContext.getPackage();
        Engine engine = pageContext.getEngine();
        String propertyName = property.toString();
        Object result = null;
        if (propertyName.equals("Engine")) {
            result = engine;
        } else if (propertyName.equals("Package")) {
            result = _package;
        } else {
            result = getEngineOrPackageProperty(engine, _package, property);
        }
        return result;
    }

    private Object getEngineOrPackageProperty(ELContext context, Object property) {
        FakePageContext pageContext = (FakePageContext) context.getContext(JspContext.class);
        Package _package = pageContext.getPackage();
        Engine engine = pageContext.getEngine();
        return getEngineOrPackageProperty(engine, _package, property);
    }

    private Object getEngineOrPackageProperty(Engine engine, Package _package, Object property) {
        Object result = getPackageProperty(_package, property);
        if (result != null) {
            Object engineProperty = getEngineProperty(engine, result);
            if (engineProperty != null) {
                result = engineProperty;
            }
        }
        return result;
    }

    private Object getEngineProperty(Engine engine, Object property) {
        Object result = null;
        if (property instanceof Item) {
            Item item = (Item) property;
            String id = item.GetValue("ID");
            if (id == null) {
                return result;
            }
            result = engine.GetObject(item);
        } else {
            String stringProperty = property.toString();
            if (TcmUri.IsValid(stringProperty) || stringProperty.startsWith("/webdav")) {
                result = engine.GetObject(stringProperty);
            }
        }
        return result;
    }

    private Object getPackageProperty(Package _package, Object property) {
        Item item = _package.GetByName(property.toString());
        return item;
    }

    private Object getBaseProperty(ELContext context, Object base, Object property) {
        Object result = null;
        String propertyName = property.toString();
        if (base instanceof Component) {
            result = getComponentProperty(base, propertyName);
        } else if (base instanceof RepositoryLocalObject) {
            result = getRepositoryLocalObjectProperty(base, propertyName);
        } else if (base instanceof Repository) {
            result = getRepositoryProperty(base, propertyName);
        } else if (base instanceof Package) {
            result = getEngineOrPackageProperty(context, propertyName);
        } else if (base instanceof ItemFields) {
            Object[] bases = new Object[] { base };
            result = getItemFieldsProperty(bases, propertyName);
            base = bases[0];
        } else if (base instanceof List) {
            Object[] bases = new Object[] { base };
            result = getListProperty(bases, property);
            base = bases[0];
        } else if (base instanceof IList) {
            Object[] bases = new Object[] { base };
            result = getIListProperty(bases, property);
            base = bases[0];
        }
        if (result == null) {
            result = getBeanProperty(base, propertyName);
        }
        return result;
    }

    private Object getComponentProperty(Object base, String propertyName) {
        Object result = null;
        Component component = (Component) base;
        if (propertyName.equals("Fields")) {
            result = new ItemFields(component.getContent(), component.getSchema());
        } else if (propertyName.equals("Metadata")) {
            result = new ItemFields(component.getMetadata(), component.getMetadataSchema());
        }
        return result;
    }

    private Object getRepositoryLocalObjectProperty(Object base, String propertyName) {
        Object result = null;
        RepositoryLocalObject localObject = (RepositoryLocalObject) base;
        if (propertyName.equals("Metadata")) {
            result = new ItemFields(localObject.getMetadata(), localObject.getMetadataSchema());
        }
        return result;
    }

    private Object getRepositoryProperty(Object base, String propertyName) {
        Object result = null;
        Repository repository = (Repository) base;
        if (propertyName.equals("Metadata")) {
            result = new ItemFields(repository.getMetadata(), repository.getMetadataSchema());
        }
        return result;
    }

    private Object getItemFieldsProperty(Object[] bases, String propertyName) {
        Object result = null;
        Object base = bases[0];
        ItemFields itemFields = (ItemFields) base;
        base = bases[0] = itemFields.getItem(propertyName);
        ItemField itemField = (ItemField) base;
        if (itemField.getDefinition().getMaxOccurs() == 1) { // single-value
            result = getBeanProperty(base, "Value");
        } else { // multi-value
            result = getBeanProperty(base, "ValuesList");
            if (result == null) {
                result = getBeanProperty(base, "Values");
            }
        }
        if (itemField.getDefinition().getMinOccurs() == 0 && result == null) { // optional
            result = "";
        }
        return result;
    }

    private Object getListProperty(Object[] bases, Object property) {
        Object result = null;
        Object base = bases[0];
        List<?> list = (List<?>) base;
        int intProperty = 0;
        if (property instanceof Number) {
            Number numberProperty = (Number) property;
            intProperty = numberProperty.intValue();
            result = list.get(intProperty);
        } else {
            base = bases[0] = list.get(intProperty);
        }
        return result;
    }

    private Object getIListProperty(Object[] bases, Object property) {
        Object result = null;
        Object base = bases[0];
        IList list = (IList) base;
        int intProperty = 0;
        if (property instanceof Number) {
            Number numberProperty = (Number) property;
            intProperty = numberProperty.intValue();
            result = list.getItem(intProperty);
        } else {
            base = bases[0] = list.getItem(intProperty);
        }
        return result;
    }

    private Object getBeanProperty(Object base, String property, Object... methodArguments) {
        Object result = null;
        String propertyName = capitalize(property);
        for (String methodPrefix : methodPrefixes) {
            String methodName = methodPrefix + propertyName;
            Class<? extends Object> clazz = base.getClass();
            Method method = findMethod(clazz, methodName);
            if (method != null) {
                try {
                    result = method.invoke(base, methodArguments);
                    break;
                } catch (Exception e) {}
            }
        }
        return result;
    }

    private Method findMethod(Class<? extends Object> clazz, String methodName) {
        for (Method method : clazz.getMethods()) {
            if (method.getName().equals(methodName)) {
                return method;
            }
        }
        return null;
    }

    private String fixFriendlyProperty(Object base, String property) {
        if (base instanceof RepositoryLocalObject) {
            if (property.equals("Publication")) {
                property = "ContextRepository";
            }
        }
        return property;
    }
}
 
 

TBB of the Week - Render Page Text Block

I thought about starting a new section on my blog -- TBB of the Week. The intention is to publish, every week, a TBB that is somehow noteworthy. Maybe it is a very generic one, or a best practice, or even a bad practice. There will be a description of the TBB: what it does and where/how to use it in a Compound Template. Of course, there will be source code too.

So to start with, the first "TBB of the Week" is going to be Render Page Text Block TBB. I first got this TBB from a colleague of mine, Eric Huiza, so I won't take credit for it.

Name
    Render Page Text Block TBB
Type
    · Template in .NET Assembly
Description
    Used to:
    · Publish Text Block scripts (JS or CSS) attached to a Page as metadata
Notes
    This TBB expects the Page to have a metadata field 'script' that contains a Component Link to the Text Block Component. The TBB creates an Output item (or overwrites the existing one) with the content of the TextBlock field in the linked Text Block Component.
    The Page in this case should not contain any Component Presentations.
Parameters
    n/a
Applicable to
    Page Templates

I like the idea a lot -- you don't have to create Component Presentations on your Page in order to publish 'static' text assets like JavaScript or CSS. Instead you just attach them to your Page Metadata.

One thing I don't like is losing the "Where Used" publishing behaviour. Namely, if you publish the JS/CSS Text Block Component, this setup will not pick up the Page. In a normal situation, where the Component appears on the Page as a CP, publishing the Component will also publish the Page using it. Not in this case...

The Code

[TcmTemplateTitle("Render Page Text Block TBB")]
publicclassRenderPageTextBlock : ITemplate{

    publicvoid Transform(Engineengine, Package package) {
        ItempageItem = package.GetByName(Package.PageName);
        Pagepage = engine.GetObject(pageItem) asPage;

        ItemFieldsmetaFields = newItemFields(page.Metadata, page.MetadataSchema);
        ComponentLinkFieldtextblockField = metaFields.Where(w => w.Name.Equals("script")).Cast<ComponentLinkField>().ElementAtOrDefault(0);

        if (textblockField != null) {
            Componenttextblock = textblockField.Value;
            ItemFieldsitemFields = newItemFields(textblock.Content, textblock.Schema);

            MultiLineTextFieldtextField = itemFields.Where(w => w.Name.Equals("TextBlock")).Cast<MultiLineTextField>().ElementAt(0);

            Itemoutput = package.GetByName(Package.OutputName);
            if(output == null) {
                package.PushItem(Package.OutputName, package.CreateStringItem(ContentType.Text, textField.Value));
            } else{
                output.SetAsString(textField.Value);
            }
        }
    }
}


Create a Schema using TOM.Net API

I was a little frustrated today for not finding out sooner about the SchemaFields class. Basically, that's the class you need when creating a Schema programmatically using the TOM.Net API in SDL Tridion 2011.

Creating the Schema itself is easy -- just use the new Schema(session, parentTcmUri) constructor -- but creating the fields is a bit tricky. You have to use the SchemaFields class, just like you would use ItemFields on normal Component fields or on any metadata fields collection.

The code below creates a Schema and adds one RTF field to it and one Keyword based field to its metadata fields:

Session session = new Session();

Schema schema = new Schema(session, new TcmUri("tcm:20-1-2")) {
    Title = "New Schema",
    Description = "Schema Description"
};

SchemaFields fields = new SchemaFields(schema);
fields.Fields.Add(new XhtmlFieldDefinition("NewField") {
    Description = "New Field Description",
    MinOccurs = 1, // mandatory
    Height = 5
});

KeywordFieldDefinition metaField = new KeywordFieldDefinition("MetaField") {
    Description = "New Meta Field Description",
    Category = new Category(new TcmUri("tcm:20-859-512"), session)
};
metaField.List.Type = ListType.Tree;
fields.MetadataFields.Add(metaField);

schema.Xsd = fields.ToXsd();
schema.Save(true);

Having problems creating a Session object? Have a look at my other post describing how to fix Session creation.

Get Linked Components TBB

This week's "TBB of the Week" is the Get Linked Components TBB. I don't know who exactly wrote it, so I'm not taking credit for it. It is part of the Templating Base project, IIRC.

Name
    Get Linked Components TBB
Type
    · Template in .NET Assembly
Description
    Used to:
    · Extract Component Links from Page metadata or from Component fields
    · Create Package items from the identified Components
Notes
    When placed on a Page Template, this generic TBB identifies Component Links in the Page's Metadata section.
    When placed on a Component Template, it identifies Component Links in the current Component's fields (including RTF and Component Link fields) and metadata fields.
    The TBB pushes each identified Component into the Package.
Parameters
    n/a
Applicable to
    Component Templates and Page Templates

This is a very powerful TBB that I use in almost all my projects. The issue it solves is the following: whenever a Component Link appears in a Component's RTF or Component Link field (or in Page metadata), it is impossible, using Dreamweaver syntax alone, to access the linked Component's fields directly. Instead, this TBB places the linked Component in the Package, so that its fields can then be accessed.
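For completeness, here is roughly how a C# TBB placed after Get Linked Components could consume one of the pushed items. This is only a sketch: the package key ("MetaData.script") and the single-value assumption depend entirely on your own field setup.

// Hypothetical fragment from a follow-up TBB's Transform(engine, package):
// read back a single-value Component Link pushed from Page metadata.
Item linkedItem = package.GetByName("MetaData.script");
if (linkedItem != null) {
    // The pushed item carries the linked Component's TcmUri in its "ID" value.
    Component linked = engine.GetObject(linkedItem.GetValue("ID")) as Component;
    ItemFields linkedFields = new ItemFields(linked.Content, linked.Schema);
    // ... read linkedFields here, e.g. to output a field of the linked Component
}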

The Code

[TcmTemplateTitle("Get Linked Components TBB")]
publicclassGetLinkedComponents: ITemplate {

    privateEngine_engine = null;
    privatePackage_package = null;
    privateTemplatingLogger_log = null;
    //-1 = not set, 0 = component, 1 = page
    privateint_renderContext = -1;

    ///<summary>
    /// Executes the transformation
    ///</summary>
    ///<param name="engine">Templating engine (context for the template code)</param>
    ///<param name="package">Transformation context (contains both the inputs and the outputs of the transformation)</param>
    voidITemplate.Transform(Engine engine, Packagepackage) {
        _engine = engine;
        _package = package;
        _log = TemplatingLogger.GetLogger(this.GetType());

        if(IsPage) {
            //Add linked components from metadata
            _log.Info("Scanning Page Metadata for link fields...");
            AddLinkedComponents(GetContextItemFields("meta"), "MetaData.", 0);
        } else{
            //Add both linked components from standard schema and metadata fields
            //_log.Info("Scanning Component for link fields...");
            AddLinkedComponents(GetContextItemFields("link"), string.Empty, 0);
            //_log.Info("Scanning Component Metadata for link fields...");
            AddLinkedComponents(GetContextItemFields("meta"), "MetaData.", 0);
            //Fix for RenderComponentField
            ItemMainComponent = _package.GetByName("Component");
            _package.Remove(MainComponent);
            _package.PushItem("Component", MainComponent);
        }
    }

    ///<summary>
    /// Add linked components to the package from the given fields
    ///</summary>
    ///<param name="fields">The ItemFields to search for component links</param>
    ///<param name="packageKeyPostfix">Postfix to add to package item name</param>
    privatevoidAddLinkedComponents(ItemFields fields, string packageKeyPrefix, intdepth) {
        if(fields != null) {
            HashtableitemFieldCounter = newHashtable();
            foreach(ItemField itemField in fields) {
                StringitemFieldName = itemField.Name.ToString();

                if(!itemFieldCounter.ContainsKey(itemFieldName)) {
                    itemFieldCounter[itemFieldName] = 0;
                } else{
                    itemFieldCounter[itemFieldName] = (int)itemFieldCounter[itemFieldName] + 1;
                }

                if(itemField isComponentLinkField) {
                    ComponentLinkFieldfield = itemField asComponentLinkField;

                    if(depth > 0) {
                        _log.Info("Found 1 deep: " + field.Name);
                    }

                    if(field != null&& ((field.Definition.MaxOccurs == 1 && field.Value != null) || field.Values != null)) {
                        // --------------------------
                        Item item = null;
                        if (field.Definition.MaxOccurs == 1 && field.Value != null) {
                            //If the field is single value, add it to the package as a component
                            //_log.Info("Found single value link field: " + field.Name);
                            item = _package.CreateTridionItem(ContentType.Component, field.Value.Id);
                        } else {
                            //Otherwise create a uri list of all values
                            IList<TcmUri> uriList = newList<TcmUri>();
                            foreach (ComponentlinkedComp in field.Values) {
                                uriList.Add(linkedComp.Id);
                            }
                            if(uriList.Count > 0) {
                                //_log.Info("Found multivalue link field: " + field.Name);
                                item = _package.CreateComponentUriListItem(ContentType.ComponentArray, uriList);
                            }

                        }
                        if (item != null) {
                            _package.PushItem(packageKeyPrefix + field.Name, item);
                        }
                        // --------------------------
                    }
                }
                    /* */
                elseif (itemField isEmbeddedSchemaField) {
                    EmbeddedSchemaFieldfield = itemField asEmbeddedSchemaField;
                    ItemFieldsfieldValues = field.Values asItemFields;
                    /* * /
                    if (field.Definition.MaxOccurs == 1 && field.Value != null)
                    {
                        //If the field is single value, add it to the package as a component
                        //_log.Info("Found single value Embedded field: " + field.Name + " -- " + field.Value.ToString());
                        //item = _package.CreateTridionItem(ContentType.Component, field.Value);
                    }
                    else
                    /* */
                    if(field.Values.Count > 0) {
                        //Otherwise create a uri list of all values
                        foreach(ItemFields linkedComp in field.Values) {
                            foreach (ItemFieldiField in linkedComp) {
                                String lItemFieldName = itemFieldName + "." + iField.Name.ToString();

                                if(!itemFieldCounter.ContainsKey(lItemFieldName)) {
                                    itemFieldCounter[lItemFieldName] = 0;
                                } else {
                                    itemFieldCounter[lItemFieldName] = (int)itemFieldCounter[lItemFieldName] + 1;
                                }

                                String newPackageKeyPrefix = packageKeyPrefix + lItemFieldName + itemFieldCounter[lItemFieldName];

                                // pass the fieldValues if its not null, and go only 1 deep
                                //_log.Info("Possible: " + newPackageKeyPrefix);
                                //_log.Info("???: " + field.Values.ToString());
                                //_log.Info("Found object: " + itemField.GetType().ToString());

                                if (iField isComponentLinkField) {
                                    ComponentLinkField convIField = iField asComponentLinkField;

                                    if(convIField != null&& ((convIField.Definition.MaxOccurs == 1 && convIField.Value != null) || convIField.Values != null)) {
                                        Item convItem = null;
                                        if (convIField.Definition.MaxOccurs == 1 && convIField.Value != null) {
                                            //If the field is single value, add it to the package as a component
                                            //_log.Info("Found single value link field: " + field.Name);
                                            convItem = _package.CreateTridionItem(ContentType.Component, convIField.Value.Id);
                                        } else {
                                            //Otherwise create a uri list of all values
                                            IList<TcmUri> convUriList = newList<TcmUri>();
                                            foreach (ComponentconvLinkedComp in convIField.Values) {
                                                if(convLinkedComp != null) {
                                                    convUriList.Add(convLinkedComp.Id);
                                                    _log.Info("Found multivalue link field: " + convLinkedComp);
                                                }
                                            }
                                            if (convUriList.Count > 0) {
                                                convItem = _package.CreateComponentUriListItem(ContentType.ComponentArray, convUriList);
                                            }

                                        }
                                        if (convItem != null) {
                                            _package.PushItem(newPackageKeyPrefix, convItem);
                                        }
                                    }
                                } elseif (iField isEmbeddedSchemaField) {
                                    AddLinkedComponents(((EmbeddedSchemaField)iField).Value, newPackageKeyPrefix, depth + 1);
                                }
                            }
                        }

                    }
                    /* */

                } // end else if (itemField is EmbeddedSchemaField)
                else{
                    //_log.Info("Found itemField: " + itemField.Name.ToString());
                }
            }
        }
    }

    ///<summary>
    /// Get the ItemFields for the context item
    ///</summary>
    ///<returns></returns>
    privateItemFieldsGetContextItemFields(string type) {
        ItemFieldsfields = null;
        if(IsPage) {
            Itemitem = _package.GetByName("Page");
            if(item != null) {
                stringpid = item.GetValue("ID");
                Pagepage = _engine.GetObject(pid) asPage;
                if(page.Metadata != null)
                    fields = newItemFields(page.Metadata, page.MetadataSchema);
            }
        } else{
            Itemitem = _package.GetByName("Component");
            if(item != null) {
                stringcid = item.GetValue("ID");
                Componentcomp = _engine.GetObject(cid) asComponent;
                if(type == "meta"&& comp.Metadata != null) {
                    fields = newItemFields(comp.Metadata, comp.MetadataSchema);
                } elseif (type == "link") {
                    fields = newItemFields(comp.Content, comp.Schema);
                }
            }
        }
        returnfields;
    }

    ///<summary>
    /// True if the rendering context is a page, rather than component
    ///</summary>
    privateboolIsPage {
        get {
            if(_renderContext == -1) {
                if(_engine.PublishingContext.ResolvedItem.Item isPage)
                    _renderContext = 1;
                else
                    _renderContext = 0;
            }
            if(_renderContext == 1)
                returntrue;
            else
                returnfalse;
        }
    }
}

RenderComponentPresentation Tag for Java Mediator

In order to write the first complete Page Template for the Java Mediator for SDL Tridion Templating, I needed a Java tag similar to the @@RenderComponentPresentation@@ from Dreamweaver Templating.

The logic to write such a tag is pretty simple. However, I wanted a versatile tag that would not only accept a Component and a Component Template, but also allow specifying some 'wildcards' or, more exactly, Regular Expressions, so that several Component Presentations can be rendered based on different criteria.

The criteria I implemented are:
  • Component title: each Component Presentation in the current Page has its Component title matched against this Regular Expression and, if a match is found, the CP is displayed;
  • Component Template title: same as above, but matching is done on the Component Presentation's Component Template title;
  • Schema title: same as above, only matching the Schema title of the Component in the Component Presentation.

Additionally, I wanted the option of sharing the Package object a Page Template interacts with into the Component Templates. Therefore, by using an 'inherit' boolean attribute on the tag, I can specify whether the Package from the Page Template is 'inherited' into the Component Templates.

Examples

A 'Real' Page Template

The JSP TBB below iterates over each Component Presentation in the current Page in the Package and triggers its rendering:

<%@taglib prefix="c" uri="http://java.sun.com/jsp/jstl/core"%>
<%@taglib prefix="t" uri="http://www.mitza.net/java/mediator/tag"%>
<html>
<head>
    <title>${Page.Title}</title>
</head>
<body>
    <c:forEach var="cp" items="${Page.ComponentPresentationsList}">
        <t:renderComponentPresentation cp="${cp}" />
    </c:forEach>
</body>
</html>

Specify Component and Component Template

The code below outputs each Component Presentation in the Page:

<c:forEach var="cp" items="${Page.ComponentPresentationsList}">
    <t:renderComponentPresentation component="${cp.Component}"
        componentTemplate="${cp.ComponentTemplate}" />
</c:forEach>

The attributes component and componentTemplate can in this case be Strings representing TcmUri values, or actual TcmUri objects.

Specify Schema

Display all Component Presentations with Component based on the Schema with TcmUri tcm:1-2-8:

<t:renderComponentPresentation schema="tcm:1-2-8" />

Note the tag above should not be placed inside a for loop -- it will display all Component Presentations that match the specified Schema.

Combine Schema and Component Template

Combinations of attributes are possible. The more attributes are specified, the more restrictive the matching of Component Presentations becomes (i.e. the operator between several attributes is AND):

<t:renderComponentPresentation schema="tcm:1-2-8" componentTemplate="tcm:1-3-32" />

Inherit Page Template Package into Component Templates

Specifying the attribute inherit="true" (default: false) causes the Package object used during the rendering of the Page Template to be re-used (inherited) in the Component Template rendering. This means that, potentially, several Component Templates can share the same Package.

This will of course also work with other, non-Java mediators. For example, when adding a C# TBB, the Package available to it could contain items inserted by previous TBBs. This way it is no longer necessary to use ContextVariables to pass values and objects from the Page Template to Component Templates.

<c:forEach var="cp" items="${Page.ComponentPresentationsList}">
    <t:renderComponentPresentation cp="${cp}" inherit="true" />
</c:forEach>

Specifying with Regular Expressions

All three attributes -- component, componentTemplate and schema -- allow Regular Expressions. They are matched against the corresponding Component Presentation values: the Component title, the Component Template title and the Component's Schema title. The operator between the attributes is AND. The tag renders all matched Component Presentations; there is no need to place it inside a loop.

<t:renderComponentPresentation schema="Article" /> - displays all CPs where the Component's Schema title contains word "Article"

<t:renderComponentPresentation componentTemplate="^Summary News$" /> - displays all CPs where the Component Template's title matches exactly "Summary News"

<t:renderComponentPresentation component="^Quarterly Report.*2012" componentTemplate="tcm:1-3-32" /> - displays all CPs where the Component's title starts with "Qarterly Report" and contains the string "2012" and are using Component Template tcm:1-3-32

Code

public class RenderComponentPresentationTag extends SimpleTagSupport {

    private TcmUri component;
    private Pattern componentPattern;
    private TcmUri componentTemplate;
    private Pattern componentTemplatePattern;
    private TcmUri schema;
    private Pattern schemaPattern;
    private boolean inherit;

    private Log log = Log.getInstance();

    @Override
    public void doTag() throws JspException, IOException {
        log.debug("RenderComponentPresentationTag.ByTcmUri: component=%s, componentTemplate=%s", component,
                componentTemplate);

        if (component != null && componentTemplate != null) {
            renderComponentPresentation(component, componentTemplate);
        } else {
            FakePageContext pageContext = getJspContext();
            Engine engine = pageContext.getEngine();
            Package _package = pageContext.getPackage();

            Item pageItem = _package.GetByName("Page");
            if (pageItem == null) {
                throw new MediatorException("Cannot find item Page in the Package. Are you in Page Template context?");
            }

            Page page = (Page) engine.GetObject(pageItem);
            renderComponentPresentations(page);
        }
    }

    @Override
    protected FakePageContext getJspContext() {
        return (FakePageContext) super.getJspContext();
    }

    public void setComponent(Object component) {
        if (component instanceof TcmUri) {
            this.component = (TcmUri) component;
        } else if (component instanceof Component) {
            this.component = ((Component) component).getId();
        } else {
            String value = component.toString();
            if (TcmUri.IsValid(value)) {
                this.component = new TcmUri(value);
            } else {
                componentPattern = Pattern.compile(value);
            }
        }
    }

    public void setComponentTemplate(Object componentTemplate) {
        if (componentTemplate instanceof TcmUri) {
            this.componentTemplate = (TcmUri) componentTemplate;
        } else if (componentTemplate instanceof ComponentTemplate) {
            this.componentTemplate = ((ComponentTemplate) componentTemplate).getId();
        } else {
            String value = componentTemplate.toString();
            if (TcmUri.IsValid(value)) {
                this.componentTemplate = new TcmUri(value);
            } else {
                componentTemplatePattern = Pattern.compile(value);
            }
        }
    }

    public void setCp(ComponentPresentation cp) {
        this.component = cp.getComponent().getId();
        this.componentTemplate = cp.getComponentTemplate().getId();
    }

    public void setSchema(Object schema) {
        if (schema instanceof TcmUri) {
            this.schema = (TcmUri) schema;
        } else if (schema instanceof Schema) {
            this.schema = ((Schema) schema).getId();
        } else {
            String value = schema.toString();
            if (TcmUri.IsValid(value)) {
                this.schema = new TcmUri(value);
            } else {
                schemaPattern = Pattern.compile(value);
            }
        }
    }

    public void setInherit(boolean inherit) {
        this.inherit = inherit;
    }

    private void renderComponentPresentations(Page page) throws IOException {
        for (ComponentPresentation cp : page.getComponentPresentationsList()) {
            boolean isRender = true;

            Component cpComponent = cp.getComponent();
            ComponentTemplate cpComponentTemplate = cp.getComponentTemplate();
            Schema cpSchema = cpComponent.getSchema();

            TcmUri cpComponentUri = cpComponent.getId();
            TcmUri cpComponentTemplateUri = cpComponentTemplate.getId();

            if (component != null) {
                isRender = component.equals(cpComponentUri);
            }

            if (isRender && componentPattern != null) {
                isRender = componentPattern.matcher(cpComponent.getTitle()).find();
            }

            if (isRender && componentTemplate != null) {
                isRender = componentTemplate.equals(cpComponentTemplateUri);
            }

            if (isRender && componentTemplatePattern != null) {
                isRender = componentTemplatePattern.matcher(cpComponentTemplate.getTitle()).find();
            }

            if (isRender && schema != null) {
                isRender = schema.equals(cpSchema.getId());
            }

            if (isRender && schemaPattern != null) {
                isRender = schemaPattern.matcher(cpSchema.getTitle()).find();
            }

            if (isRender) {
                renderComponentPresentation(cpComponentUri, cpComponentTemplateUri);
            }
        }
    }

    private void renderComponentPresentation(TcmUri component, TcmUri componentTemplate) throws IOException {
        log.debug(
                "RenderComponentPresentationTag.renderComponentPresentation: component=%s, componentTemplate=%s, inherit=%b",
                component, componentTemplate, inherit);
        String cpContent;
        FakePageContext pageContext = getJspContext();
        Engine engine = pageContext.getEngine();
        JspWriter out = pageContext.getOut();

        if (inherit) {
            MediatorEngine mediatorEngine = MediatorEngine.Wrap(engine);
            cpContent = mediatorEngine.RenderComponentPresentation(component, componentTemplate);
        } else {
            cpContent = engine.RenderComponentPresentation(component, componentTemplate);
        }

        out.print(cpContent);
    }
}



Create and Publish Page for Component in Workflow using Core Service

It seems like this use case keeps coming back time and time again. The requirement: when a new Component is in workflow, part of the approval workflow on the Component should be an automatic Page generation (where the Component is placed on the Page) and possibly publishing of that Page, for the purpose of previewing the new Component in a Page context. All this happens while the Component is still in workflow, so the approver can actually see how the Component looks before approving/rejecting it.

The only issue, as described in the previous post, is that it is not possible to add a v0.x Component to a Page when the Page is in a child Publication. The solution is to do all of this programmatically.

In the following example, I use the Core Service to create a new Page (with values taken from a Folder metadata, for example) and place the v0.x Component on it.

using (CoreServiceSession coreService = new CoreServiceSession())
{
    if (component.Version < 1)
    {
        TcmUri pageTcmUri = coreService.CreatePage(title, fileName, sgTcmUri, ptTcmUri, localComponentTcmUri, ctTcmUri);
        PublisherHelper publisherHelper = new PublisherHelper();
        publisherHelper.Publish(pageTcmUri, targetTcmUri, true);
    }
}

The Page is first created, then published (as described in my earlier posts about "How to Publish Stuff Programmatically"). The CoreServiceSession method CreatePage is the following:

public TcmUri CreatePage(String title, String fileName, TcmUri sgTcmUri, TcmUri ptTcmUri, TcmUri componentTcmUri, TcmUri ctTcmUri)
{
    PageData pageData = new PageData();
    pageData.Id = TcmUri.UriNull.ToString();
    pageData.Title = title;
    pageData.FileName = fileName;

    LinkToPageTemplateData linkToPageTemplateData = new LinkToPageTemplateData();
    linkToPageTemplateData.IdRef = ptTcmUri.ToString();
    pageData.PageTemplate = linkToPageTemplateData;
    pageData.IsPageTemplateInherited = false;

    LinkToOrganizationalItemData linkToOrganizationalItemData = new LinkToOrganizationalItemData();
    linkToOrganizationalItemData.IdRef = sgTcmUri.ToString();
    LocationInfo locationInfo = new LocationInfo();
    locationInfo.OrganizationalItem = linkToOrganizationalItemData;
    pageData.LocationInfo = locationInfo;

    ComponentPresentationData[] cps = new ComponentPresentationData[1];
    cps[0] = new ComponentPresentationData();
    LinkToComponentData linkToComponentData = new LinkToComponentData();
    linkToComponentData.IdRef = componentTcmUri.ToString();
    cps[0].Component = linkToComponentData;
    LinkToComponentTemplateData linkToComponentTemplateData = new LinkToComponentTemplateData();
    linkToComponentTemplateData.IdRef = ctTcmUri.ToString();
    cps[0].ComponentTemplate = linkToComponentTemplateData;
    pageData.ComponentPresentations = cps;

    IdentifiableObjectData objectData = _coreServiceClient.Create(pageData, ReadOptions);

    return new TcmUri(objectData.Id);
}


Writing to Multiple File Systems from the Same Deployer

This topic is not new. It comes back regularly, with pretty much every single enterprise client I have implemented for. "Why do we need different Deployers for different file systems?" "Why can't Tridion just publish to different file systems?" And so on...

Recently, it came up again, so I set up a small PoC to see how feasible it is to write a Storage Extension (in SDL Tridion 2011SP1) that would perform the typical CRUD operations a Deployer would perform, only on multiple file systems.

The idea behind the storage extension is to have several file systems defined in cd_storage_conf.xml that would be grouped under one logical name. Then have an item type mapping (e.g. Page) that would point to the group of file systems. The goal is to have that item type created, updated, removed, etc. on each of the file systems defined in the group.

The cd_storage_conf.xml Storages node would look something like this:


    <Storage Type="filesystem" Class="com.tridion.storage.filesystem.FSDAOFactory"
        Id="MultiFS" defaultFilesystem="false">
        <Root Path="not-used"/>
    </Storage>

    <Storage Type="filesystem" Class="com.tridion.storage.filesystem.FSDAOFactory"
        Id="FileRoot1" defaultFilesystem="false">
        <Root Path="C:\Temp\Root1"/>
    </Storage>

    <Storage Type="filesystem" Class="com.tridion.storage.filesystem.FSDAOFactory"
        Id="FileRoot2" defaultFilesystem="false">
        <Root Path="D:\Temp\Root2"/>
    </Storage>

    <StorageGroup Id="MultiFS">
        <Storage Id="FileRoot1"/>
        <Storage Id="FileRoot2"/>
    </StorageGroup>


Item mapping for the Page type would point to the MultiFS id:


    <ItemTypes defaultStorageId="brokerdb" cached="true">
        <Item typeMapping="Page" cached="false" storageId="MultiFS"/>
    </ItemTypes>


In order to make the setup above work, I had to create my own DAO (Data Access Object) storage extension. There is a reference to the DAO bundle definition in cd_storage_conf.xml:


    <StorageBindings>
        <Bundle src="multifs_dao_bundle.xml"/>
    </StorageBindings>


The file multifs_dao_bundle.xml contains the definition of my custom DAO:


<StorageDAOBundles>
    <StorageDAOBundle type="filesystem">
        <StorageDAO typeMapping="Page"
            class="com.tridion.extension.multifs.MultiFSDAO"/>
    </StorageDAOBundle>
</StorageDAOBundles>


The whole logic lies in the class MultiFSDAO, which acts as a wrapper around an array of com.tridion.storage.filesystem.FSPageDAO objects. A helper configuration class (sketched after the constructors below) reads the StorageGroup node from cd_storage_conf.xml and then reads the Root/@Path (i.e. storage location) value for each referenced Storage node.


public class MultiFSDAO extends FSBaseDAO implements PageDAO {

    private FSPageDAO[] pageDAOs;

    public MultiFSDAO(String storageId, String storageName, File storageLocation) {
        super(storageId, storageName, storageLocation);
        createDAOs(storageId, storageName, null);
    }

    public MultiFSDAO(String storageId, String storageName, File storageLocation, FSEntityManager entityManager) {
        super(storageId, storageName, storageLocation, entityManager);
        createDAOs(storageId, storageName, entityManager);
    }

    private void createDAOs(String storageId, String storageName, FSEntityManager entityManager) {
        MultiFSConfiguration configuration = MultiFSConfiguration.getInstance();
        Map<String, String> storageGroups = configuration.getStorageGroups();
        String groups = storageGroups.get(storageId);
        if (groups == null) {
            groups = storageId;
        }

        String[] storageIds = groups.split(",");
        pageDAOs = new FSPageDAO[storageIds.length];
        Map<String, String> storageLocations = configuration.getStorageLocations();

        for (int i = 0; i < storageIds.length; i++) {
            String id = storageIds[i];
            String location = storageLocations.get(id);

            if (entityManager == null) {
                pageDAOs[i] = new FSPageDAO(id, storageName, new File(location));
            } else {
                pageDAOs[i] = new FSPageDAO(id, storageName, new File(location), entityManager);
            }
        }
    }
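
The MultiFSConfiguration helper used by createDAOs() is not shown in the post; the getInstance/getStorageGroups/getStorageLocations names come from the code above, but everything else below is a minimal sketch under my own assumptions (in particular, how and from where the cd_storage_conf.xml file is located and parsed):

import java.io.File;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Hypothetical helper: exposes the storage groups and storage root paths from cd_storage_conf.xml
public class MultiFSConfiguration {

    // Assumed location of the configuration file; adjust to where your Deployer loads it from
    private static final String CONFIG_FILE = "cd_storage_conf.xml";

    private static MultiFSConfiguration instance;
    private final Map<String, String> storageGroups = new HashMap<String, String>();    // group Id -> "id1,id2"
    private final Map<String, String> storageLocations = new HashMap<String, String>(); // storage Id -> Root/@Path

    private MultiFSConfiguration() {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(new File(CONFIG_FILE));

            // <Storage Id="FileRoot1" ...><Root Path="C:\Temp\Root1"/></Storage>
            NodeList storages = doc.getElementsByTagName("Storage");
            for (int i = 0; i < storages.getLength(); i++) {
                Element storage = (Element) storages.item(i);
                NodeList roots = storage.getElementsByTagName("Root");
                if (roots.getLength() > 0) {
                    storageLocations.put(storage.getAttribute("Id"),
                            ((Element) roots.item(0)).getAttribute("Path"));
                }
            }

            // <StorageGroup Id="MultiFS"><Storage Id="FileRoot1"/>...</StorageGroup>
            NodeList groups = doc.getElementsByTagName("StorageGroup");
            for (int i = 0; i < groups.getLength(); i++) {
                Element group = (Element) groups.item(i);
                StringBuilder ids = new StringBuilder();
                NodeList members = group.getElementsByTagName("Storage");
                for (int j = 0; j < members.getLength(); j++) {
                    if (ids.length() > 0) {
                        ids.append(",");
                    }
                    ids.append(((Element) members.item(j)).getAttribute("Id"));
                }
                storageGroups.put(group.getAttribute("Id"), ids.toString());
            }
        } catch (Exception e) {
            throw new RuntimeException("Cannot read " + CONFIG_FILE, e);
        }
    }

    public static synchronized MultiFSConfiguration getInstance() {
        if (instance == null) {
            instance = new MultiFSConfiguration();
        }
        return instance;
    }

    public Map<String, String> getStorageGroups() {
        return storageGroups;
    }

    public Map<String, String> getStorageLocations() {
        return storageLocations;
    }
}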


Once we have the array of FSPageDAO objects, it's a simple matter of just implementing the CRUD operations on the collection of FSPageDAOs.


public void create(CharacterData page, String relativePath) throws StorageException {
    for (PageDAO pageDAO : pageDAOs) {
        pageDAO.create(page, relativePath);
    }
}

// Note: reads are executed against every DAO, but only the result from the last one is returned.
public Collection<CharacterData> findAll(int publicationId) throws StorageException {
    Collection<CharacterData> result = null;
    for (PageDAO pageDAO : pageDAOs) {
        result = pageDAO.findAll(publicationId);
    }

    return result;
}

public CharacterData findByPrimaryKey(int publicationId, int pageId) throws StorageException {
    CharacterData result = null;
    for (PageDAO pageDAO : pageDAOs) {
        result = pageDAO.findByPrimaryKey(publicationId, pageId);
    }

    return result;
}

public void remove(int publicationId, int pageId, String relativePath) throws StorageException {
    for (PageDAO pageDAO : pageDAOs) {
        pageDAO.remove(publicationId, pageId, relativePath);
    }
}

public void update(CharacterData page, String originalRelativePath, String newRelativePath) throws StorageException {
    for (PageDAO pageDAO : pageDAOs) {
        pageDAO.update(page, originalRelativePath, newRelativePath);
    }
}


The big disclaimer: the code above is by no means production ready -- I just used it for a small PoC. I have not tested it thoroughly either. It does deploy pages to multiple file systems, but I did not try any corner cases. I don't even think it works in all scenarios: think here about transactionality, or what happens (or should happen) if one destination fails. The deploy will not be rolled back. What happens upon unpublish of a previously failed publish? And the questions could go on... Use at your own discretion!

Referencing the Tridion.ContentManager.config

There are times when you need to create a new Session (i.e. Tridion.ContentManager.Session) in TOM.NET. This is considered bad practice, for reasons I'm not going to delve into here.

By default, when attempting to create a new instance (e.g. Session session = new Session()), you get a run-time exception: The type initializer for 'Tridion.Localization.StringResourceManager' threw an exception. This can be fixed by referencing the Tridion.ContentManager.config file from your application's app.config, web.config, or DLL.config.

Make sure your config contains at least the following lines:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
    <configSections>
        <section name="tridionConfigSections" type="Tridion.Configuration.ConfigurationSections, Tridion.Common, Version=3.0.0.211, Culture=neutral, PublicKeyToken=349a39f202fa9b53" />
    </configSections>
    <tridionConfigSections>
        <sections>
            <clear />
            <add filePath="C:\Program Files (x86)\Tridion\config\Tridion.ContentManager.config" />
        </sections>
    </tridionConfigSections>
    <startup>
        <supportedRuntime version="v4.0" />
        <supportedRuntime version="v2.0.50727" />
    </startup>
</configuration>
 
In my case C:\Program Files (x86)\Tridion is my %TRIDION_HOME%. Make sure you use the correct path on your system.

 

How to Create a Folder Structure with the Core Service?

This topic has been asked recently on StackOverflow. The question was how to create a Folder (or actually a series of nested Folders), like /aaa/bbb/ccc/ddd, using the Core Service in Tridion 2011SP1.

The answer is you have to create each individual Folder as a sub-folder of an existing parent. This means the approach is very well suited to be handled with a recursive call.

I created a method called GetOrCreate(path) that checks if the path exists. If it does, it returns the FolderData object. Otherwise, it splits the path into a parent path and a new folder name and applies recursion on the parent path. On the way out of the recursion, the new folders are created as children of the (by now) existing parents.

private FolderData GetOrCreateFolder(string folderPath, SessionAwareCoreServiceClient client)
{
    ReadOptions readOptions = new ReadOptions();

    if (client.IsExistingObject(folderPath))
    {
        return client.Read(folderPath, readOptions) as FolderData;
    }
    else
    {
        int lastSlashIdx = folderPath.LastIndexOf("/");
        string newFolder = folderPath.Substring(lastSlashIdx + 1);
        string parentFolder = folderPath.Substring(0, lastSlashIdx);
        FolderData parentFolderData = GetOrCreateFolder(parentFolder, client);
        FolderData newFolderData = client.GetDefaultData(ItemType.Folder, parentFolderData.Id) as FolderData;
        newFolderData.Title = newFolder;

        return client.Save(newFolderData, readOptions) as FolderData;
    }
}

Note: the method does not validate the folderPath. Rather, it expects it to be syntactically correct and its stem to represent a valid and existing Publication WebDAV URL.

Call this method using a construct like this:

    FolderData folderData = GetOrCreateFolder(
        "/webdav/020 Content/Building Blocks/aaa/bbb/ccc",
        new SessionAwareCoreServiceClient());

You can also use a CoreServiceSession, as described in my earlier post.

Publishing from Template Code Using an Impersonated User

This topic touches on hot Tridion practices, both of them debatable:
  • Publishing from Template code;
  • Impersonating a user in Template code;

I won't go into the debate (maybe I'll write about it at a later stage). I will just say that I don't find it bad practice to impersonate in templates, hence the code and topic below.

The use case for "publishing from template code" comes mainly when dealing with Multimedia Components that need to be published also as Dynamic Component Presentations (DCPs). If they weren't publish from template code, you would either have to publish them manually or from some event system. I think both alternatives are clumsy and less suitable than the publish from template code.

So, my requirement is to issue a Publish on a given Tridion item from template code. The publish should have the same properties as the original item (the one currently being rendered by templates). So, the same target, user, and priority.

First, we need to retrieve the "Current Publish Transaction". As I mentioned in my earlier post, this is not possible in Tridion 2011SP1 using the API. Rather, you can use the little hack I presented in "Get Current Publish Transaction". Once we have the PublishTransaction, we can use the properties .Creator and .PublishPriority to get the user and priority, respectively.

Additionally, in order to retrieve the current PublicationTarget, we can use engine.PublishingContext.PublicationTarget.

public void Publish(Engine engine, String tcmUri, User user, PublishPriority priority)
{
    Session session = new Session(user.Title);

    PublishInstruction publishInstruction = new PublishInstruction(session);
    RenderInstruction renderInstruction = new RenderInstruction(session);
    renderInstruction.RenderMode = RenderMode.Publish; // work-around: needs to be specified for binaries
    publishInstruction.RenderInstruction = renderInstruction;

    List<IdentifiableObject> items = new List<IdentifiableObject>() { session.GetObject(tcmUri) };
    List<PublicationTarget> targets = new List<PublicationTarget>() { engine.PublishingContext.PublicationTarget };
    PublishEngine.Publish(items, publishInstruction, targets, priority);

    session.Dispose();
}

Note: the code above is not production ready; it is just an example. It is not checking whether the render mode is Publish or Preview. Also, for performance reasons, it should check whether the binary has already been queued up for publishing and, if so, not queue it again.

Using the code above is rather simple:

    PublishTransaction currentTransaction = TemplateUtils.GetPublishTransaction(engine);
    TemplateUtils.Publish(engine, itemUri, currentTransaction.Creator, currentTransaction.Priority);

The trick when publishing with an impersonated user is to create the PublishInstruction and RenderInstruction using the impersonated Session. Additionally, the item to publish also has to be retrieved with the impersonated Session.

Finally, in order for the impersonation to work from template code, the SYSTEM user has to be allowed to impersonate in Tridion. The reason is that the user executing the template code is by default the SYSTEM user (i.e. the user running the Tridion Content Manager Publisher service).

Open the SDL Tridion Content Manager configuration MMC snap-in and expand nodes SDL Tridion Content Manager / Impersonation Users. Add impersonation user NT AUTHORITY\SYSTEM with user type Windows.

Shut down the COM+ application SDL Tridion Content Manager under Component Services / Computers / My Computer / COM+ Applications. Then restart the Transport and Publisher services.

Why Should You Use the /uploadpdb Option with TcmUploadAssembly.exe ?

A Bit of Background

TcmUploadAssembly.exe is a utility program that allows you to upload a DLL into the SDL Tridion Content Manager. It does the following:
  • Creates a TBB (of type .NET Assembly) that holds the actual .NET DLL;
  • Creates a TBB (of type C# Code Fragment) for each class in the .NET DLL that implements the ITemplate interface;
  • Optionally, creates Parameter Schemas and associates them with the individual template TBBs;
  • Optionally, uploads the PDB corresponding to the .NET DLL;
The following command line arguments are accepted:

Usage: TcmUploadAssembly [options] [ConfigurationFilePath] [AssemblyPath]
  ConfigurationFilePath    The location where the configuration file for this
                           tool can be found.
  AssemblyPath             The location of the assembly to be uploaded.

Basic Options:
  /verbose                 Output extra debug information.
  /help                    Print this message.
Overriding Options (overriding either assembly or configuration file values):
  /folder:tcmuri           Override the folder to store the assembly
                           specified in the assembly.
  /targeturl:cmsurl        Override the target url of the CMS
                           specified in the configuration file.
  /username:nameofuser     Override the username to use in authentication
                           with the web service.
  /password:clearpassword  Override the password to use in authentication
                           with the web service.
  /uploadpdb:true|false    Override whether the PDB should be uploaded
                           together with the assembly.
  /timeout:in seconds      Override the time it needs to upload the assembly
                           to the web service.


Why the PDB?

To put it simply, without the PDB, you won't be able to debug your code. A PDB file contains information about the source file names with line numbers and the local variable names. So, without a PDB, you won't know in which file the particular class is and at which line your execution pointer is. You will only know the name of the class and method you are in. That is not enough information for a debugger to go by.

You can find a techie, yet excellent, article here: PDB Files: What Every Developer Must Know.

Without a PDB, a stack trace would look something like this (I'm attempting a division by zero in my code to yield an exception):

Attempted to divide by zero.
   at CwaReferenceImplementation.Templates.Generic.FormatDate.Transform(Engine engine, Package package)

While with the PDB present, the exact same code shows the following stack trace:

Attempted to divide by zero.
   at CwaReferenceImplementation.Templates.Generic.FormatDate.Transform(Engine engine, Package package) in C:\Reference Implementations\CWA Reference Implementation\Tridion Templates\Reference Implementation Templates\Generic\FormatDate.cs:line 25

TcmUploadAssembly will upload the PDB either to C:\temp or to %TRIDION_CM_HOME%\temp.

Got org.hibernate.hql.ast.HqlToken Exception? Here's the Solution!

Typically, when you get the org.hibernate.hql.ast.HqlToken ClassNotFoundException, you are installing Tridion Content Delivery on WebLogic (or, at least, that's where I encountered it every time).

The actual stack trace might look something like:
java.lang.IllegalArgumentException: org.hibernate.QueryException: ClassNotFoundException: org.hibernate.hql.ast.HqlToken
...
    at org.hibernate.ejb.AbstractEntityManagerImpl.throwPersistenceException(AbstractEntityManagerImpl.java:601)
    at org.hibernate.ejb.AbstractEntityManagerImpl.createQuery(AbstractEntityManagerImpl.java:96)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
...

Solution: luckily, this is a known issue and is caused by a conflict in classes and class loaders of WebLogic. So, in order to fix it, you should remove stax-api.jar from your application's class-path (or WEB-INF/lib/).

Also, you might set your WEB-INF classes as preferred (the prefer-web-inf-classes setting in weblogic.xml).

The solution is described in detail on the SDL LiveContent documentation portal under publication SDL Tridion 2011SP1 / Installing SDL Tridion / Installing the Content Delivery server / Installing server roles in Java/JSP / Installing a server role as a JSP Web application - universal procedure (deep link here). Access to the documentation portal can be gained via www.sdltridionworld.com


Slow DB Performance on Content Manager Writes


The issue below took me a good 6 hours to fix. Hopefully it will save somebody that time, so here it is:

My Tridion 2011 SP1 HR1 Content Manager was getting hit by a very peculiar performance issue – when saving, creating or deleting any item, the operation would be very slow (5+ seconds). This was on a controlled environment (my VMWare) with nobody else on the system. The issue was manifesting in both the CME and with API (Core Service and TOM.NET).

After checking the usual culprits (disabling Event Systems, disabling CME extensions, executing sp_updatestats, rebuilding all table indexes, repairing SDL Tridion CM, uninstalling/reinstalling the CM), I finally noticed the following line in the Tridion event log (in fact, there were hundreds of these warnings):

Unable to notify "T2011SP1". Reason: The requested name is valid, but no data of the requested type was found
Component: Tridion.ContentManager
Errorcode: 0
User: T2011SP1HR1\MTSUser

The host T2011SP1 is the old hostname of my VMWare. I make a habit of renaming my servers to reflect the version of Tridion I’m running. This immediately led me to go to my Tridion_cm database and delete the entries from the QUEUE_CONSUMERS and QUEUE_FILTERS tables that have HOST = ‘T2011SP1’. A less drastic measure would have been to simply set their respective IS_ONLINE column to 0.

I restarted my Tridion * services, shut down the Tridion COM+ application, and magically performance was back to what it used to be. Now I can batch update existing items via the Core Service at about ~0.25s per item.

What seemed very strange was that the ‘write’ operations were being broadcast to the other CM instances present in the QUEUE_CONSUMERS table. That table contains messages of type Publish, Deployer, Workflow, and Search. I guess it was the Search that was being updated… not sure.

RenderedItem Metadata and Instruction

Ever wondered what the methods AddMetadata and AddInstruction do in the RenderedItem class? Did you ever notice they were there? I’m talking about Tridion.ContentManager.Publishing.Rendering.RenderedItem and the methods:

  • public void AddMetadata(XmlElement metadata)
  • public void AddInstruction(InstructionScope scope, XmlElement instruction)

They were introduced a while back (IIRC, R5.2) and they are responsible for sending metadata from the Content Manager to Content Delivery, or more exactly to the Deployer. This is a great idea and it’s unfortunate that very few of us know about them. What’s even more unfortunate is that this is your typical Tridion half-implementation. Namely, while there is this nice API to send metadata to the Deployer, there is no API whatsoever to read it once it reaches the Deployer. You are on your own. But, let’s see what they do.

AddMetadata(XmlElement metadata)

Documentation states: "The metadata parameter can contain any valid XML structure which can be processed at the Content Delivery side. [...] When a Page with a PageTemplate is rendered this added metadata is available on the Page at the Content Delivery side. When a Component with a ComponentTemplate is rendered this added metadata is available on the Page when the ComponentPresentation is embedded, in case of a Dynamic ComponentPresenation the added metadata is available on the ComponentPresentation at the Content Delivery Side."

What is unclear is the "can be processed at the Content Delivery side". This should read, "can be processed during Deployment, in either a Deployer processor/module or in a Storage extension". This metadata is available as XML inside certain files (we'll see exactly in which ones) in the transport package and cannot be accessed after the Deployment/Publishing finishes. This metadata does not end up in any Content Delivery database tables or file system. Also, this information is only available while publishing and not for unpublish.

In the example below, I'm setting the Metadata XML node of a Folder as RenderedItem metadata. Of course, any XmlElement would do; I am simply using an easy way to get some meaningful metadata and attach it. The Folder metadata contains an Embedded Schema field "CDNSettings" that has two sub-fields: "CDN" and "UrlBasePath". The TOM.NET code looks like this:

Folder folder = engine.GetObject("tcm:14-6-2") as Folder;
engine.PublishingContext.RenderedItem.AddMetadata(folder.Metadata);

I placed the code above in a C# Fragment TBB that I added to a Dynamic Component Template.

When publishing a Page that contains a CP with that Dynamic CT, the following metadata is generated (as XML node) and it is available in the transport package in the component_presentations.xml file:

<ComponentPresentations>
...
    <RenderingMetadata>
      <Metadata xmlns="uuid:5bd95ff5-a509-4490-b37d-211ac5907c53">
        <CDNSettings>
          <CDN>Akamai</CDN>
          <UrlBasePath>http://cdn.akamai.com</UrlBasePath>
        </CDNSettings>
      </Metadata>
    </RenderingMetadata>
...

Placing the C# TBB on the Page Template will generate the RenderingMetadata node in the transport package, inside the file pages.xml:

<Pages>
...
    <RenderingMetadata>
      <Metadata xmlns="uuid:5bd95ff5-a509-4490-b37d-211ac5907c53">
        <CDNSettings>
          <CDN>Akamai</CDN>
          <UrlBasePath>http://cdn.akamai.com</UrlBasePath>
        </CDNSettings>
      </Metadata>
    </RenderingMetadata>
...

AddInstruction(InstructionScope scope, XmlElement instruction)

Documentation states: "The instruction parameter can contain any valid XML structure which can be processed at the Content Delivery side. [..] Adding the instruction with a global scope is available at instruction level at the Content Delivery side. When a Page with a PageTemplate is rendered with adding a local instruction this instruction is available on the Page module at the Content Delivery side. When a Component with a ComponentTemplate is rendered with adding a local instruction this instruction available on the Page module when the ComponentPresentation is embedded, in case of a Dynamic ComponentPresenation the added local instruction is available on the ComponentPresentation module at the Content Delivery Side."

Again, the documentation is unclear regarding the "can be processed at the Content Delivery side". Same as with AddMetadata, the instruction XmlElement can be processed during deployment in either the Deployer or a Storage Extension. This information is not stored anywhere after deployment (not in file system or database based Brokers).

The InstructionScope determines where the instruction XML is available in the transport package:

  • InstructionScope.Local

Publishing using the code below will result in the RenderingInstructions node in either component_presentations.xml or pages.xml in the transport package being populated with the Folder metadata XML.

Folder folder = engine.GetObject("tcm:14-6-2") as Folder;
engine.PublishingContext.RenderedItem.AddInstruction(InstructionScope.Local, folder.Metadata);

Adding the instruction in a Component Template will create the following structure in component_presentations.xml:

<ComponentPresentations>
...
  <ComponentPresentation IsRendered="true">
    <RenderingInstructions>
      <Metadata xmlns="uuid:5bd95ff5-a509-4490-b37d-211ac5907c53">
        <CDNSettings>
          <CDN>Akamai</CDN>
          <UrlBasePath>http://node.akamai.com</UrlBasePath>
        </CDNSettings>
      </Metadata>
    </RenderingInstructions>
...

Adding the instruction in a Page Template will create the following structure in pages.xml:

<Pages>
...
    <RenderingInstructions>
      <Metadata xmlns="uuid:5bd95ff5-a509-4490-b37d-211ac5907c53">
        <CDNSettings>
          <CDN>Akamai</CDN>
          <UrlBasePath>http://node.akamai.com</UrlBasePath>
        </CDNSettings>
      </Metadata>
    </RenderingInstructions>
...

  • InstructionScope.Global

Using InstructionScope.Global will generate the following XML structure in the instructions.xml file in the transport package:

<ProcessorInstructions version="6.1.0.996">
...
  <RenderingGlobalInstructions>
    <Metadata xmlns="uuid:5bd95ff5-a509-4490-b37d-211ac5907c53">
      <CDNSettings>
        <CDN>Akamai</CDN>
        <UrlBasePath>http://node.akamai.com</UrlBasePath>
      </CDNSettings>
    </Metadata>
  </RenderingGlobalInstructions>
...

Usage

On its own, this mechanism is really just a half-implementation. It is up to the implementer to read this XML on the Content Delivery side (read Deployer or Storage extension) during publishing. Remember, this won't work during unpublish, as no templates are executed (rendered) during unpublishing. What's really annoying is that the implementer is responsible for writing the code to extract the XmlElement metadata or instruction from their respective files in the transport package!

I would use this mechanism to set, via templating, information that I want to have available during deployment, knowing that this information is not going to be stored on the Content Delivery side.

Most likely I would implement a POJO, or in fact a JavaBean, and use JAXB to unmarshal the XML into it. An alternative is to build an XML DOM and query it with XPath selectors.
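
As an illustration of the JAXB route, here is a minimal sketch, assuming the CDNSettings structure from the examples above and a DOM node already extracted from component_presentations.xml / pages.xml; the class, its fields and the fromNode() helper are mine, not part of any Tridion API, and the namespace must match your own metadata Schema's namespace:

import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;
import org.w3c.dom.Node;

// Hypothetical JavaBean for the <CDNSettings> metadata published via AddMetadata/AddInstruction
@XmlRootElement(name = "CDNSettings", namespace = CdnSettings.NS)
@XmlAccessorType(XmlAccessType.FIELD)
public class CdnSettings {

    // Namespace of the metadata Schema, as seen in the transport package samples above
    static final String NS = "uuid:5bd95ff5-a509-4490-b37d-211ac5907c53";

    @XmlElement(name = "CDN", namespace = NS)
    private String cdn;

    @XmlElement(name = "UrlBasePath", namespace = NS)
    private String urlBasePath;

    public String getCdn() { return cdn; }
    public String getUrlBasePath() { return urlBasePath; }

    // Unmarshal directly from the <CDNSettings> DOM node found under RenderingMetadata/RenderingInstructions
    public static CdnSettings fromNode(Node cdnSettingsNode) throws Exception {
        JAXBContext context = JAXBContext.newInstance(CdnSettings.class);
        return context.createUnmarshaller()
                .unmarshal(cdnSettingsNode, CdnSettings.class).getValue();
    }
}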

Yet Another Java Mediator?

I set out recently on the quest of creating a Java Mediator. I'm definitely not the first one, but I am surprised how far I got in a relatively short time (over the weekend, basically between going to pumpkin patches with my daughters) :)

Ok, so it is still work in progress, but this is what I have:
  • Tom.Java - full API conversion of Tom.Net into C# and Java JNI proxies;
  • "Java Fragment" TBB type in the Content Manager;
  • Mediator.Java and its counterpart Mediator.Net proxies that are able to dynamically compile a Java Fragment TBB, dynamically load its .class type, and dynamically run its Transform method;
This is still in very early stages of development / productization, but I'm still very proud of what I accomplished in a relatively short time.

So, how I did it and what I actually accomplished is next...

General Approach

My goal was to mimic the CM approach for .NET templating. Therefore, I wanted to create a Java Fragment TBB that would hold, well... a Java fragment. This TBB would be executed when placed on a Compound Template by a Mediator, so I went ahead and created a normal .NET Tridion Mediator.

The approach is to use a .NET to Java conversion using a code generation tool, similar to JuggerNET. The .NET Mediator would call the Java proxy that would end up in calling the real Java implementation of the mediator. At the same time, the .NET object needed by the mediator would be passed on to the Java counterpart as 'real' objects. The proxy from .NET to Java would take care of the 2-way communication from .NET object to Java proxy and back to .NET object.

Once in Java context, the Java Fragment source code would be 'injected' into a predefined string representing a Java source class. Then the Java source would be compiled into a .class file. Finally the compiled .class would be loaded using reflection and its method executed. This method would accept the input objects Engine and Package, so the Java Fragment would operate on the actual (yet proxied) .NET objects.

Do you Speak Tom.Java?

Since there was a way to do Java to .NET conversion -- this is what the Tridion Content Delivery API does using JuggerNET -- there must be a way to do the opposite (i.e. generate Java proxies from an existing .NET DLL). Googling for "calling java from .net" landed me on the jni4net website (http://jni4net.sourceforge.net/), which seemed to do exactly what I wanted. In short, jni4net is a 2-way proxy from Java to .NET to Java (or the other way around, .NET to Java to .NET).

The tricky part was to generate these proxies. For that, I used the tool that jni4net provides (ProxyGen). The idea is to feed either your C# assemblies or your Java JARs into the tool in order to generate Java JNI proxies for the assemblies and C# proxies for the JARs.

I used as input the Tridion.ContentManager.*.dll from the [Tridion_Home]\bin\client folder. It took me about 2 days to come up with something worthy, due to some quirks of ProxyGen, limitations, known issues, my own learning curve, etc. The final result is 2 files:
  • Tom.Java.Proxy.dll - the C# counter-part of the proxies required by jni4net;
  • Tom.Java.Proxy.jar - the JNI Java proxy classes;
It generated approx. 520 proxy classes from the Tridion DLLs, which represent more or less the entire 'client' Content Manager API. ProxyGen has a few limitations, such as not dealing with Generics and not generating Enums, but other than that it is a great tool!

Using Tom.Java one can write calls from .NET into Java passing the Tridion objects as parameters, such as:

C#:
public static void Main(string[] args) {
    BridgeSetup bridgeSetup = new BridgeSetup(false) { Verbose = true };
    bridgeSetup.AddClassPath(@"C:\Java Mediator\ProxyGen\lib\jni4net.j-0.8.7.0.jar");
    bridgeSetup.AddClassPath(@"C:\Java Mediator\Tom.Java\dist\Tom.Java.Proxy.jar");

    Bridge.CreateJVM(bridgeSetup);
    Bridge.RegisterAssembly(typeof(Engine_).Assembly);
    Bridge.RegisterAssembly(typeof(MediatorTest).Assembly);

    MediatorTest test = new MediatorTest();
    Session session = new Session();
    test.testPage(session);
}

Java (implementation of MediatorTest class):
import tridion.contentmanager.Session;
import tridion.contentmanager.TcmUri;
import tridion.contentmanager.communicationmanagement.Page;

public class MediatorTest {
    public void testPage(Session session) {
        TcmUri pageUri = new TcmUri("tcm:1-2-64");
        Page page = new Page(pageUri, session);
        System.out.println("Page Title: " + page.getTitle());
    }
}

The Mediator Stuff

Once I had Tom.Java generated, writing the Mediator was in fact pretty straightforward. In C#, implement the IMediator interface with its Transform and Configure methods. Then simply call a Java proxy to the 'real' IMediator implementation.

public class JavaMediator : IMediator {

    public void Transform(Engine engine, Template template, Package package) {
        BridgeSetup bridgeSetup = new BridgeSetup(false) { Verbose = false };
        bridgeSetup.AddClassPath(@"C:\Java Mediator\ProxyGen\lib\jni4net.j-0.8.7.0.jar");
        bridgeSetup.AddClassPath(@"C:\Java Mediator\Mediator.Java\dist\Mediator.Java.jar");
        bridgeSetup.AddClassPath(@"C:\Java Mediator\Tom.Java\dist\Tom.Java.Proxy.jar");

        Bridge.CreateJVM(bridgeSetup);
        Bridge.RegisterAssembly(typeof(Engine_).Assembly);
        Bridge.RegisterAssembly(typeof(JavaMediatorImpl).Assembly);

        IMediator mediator = new JavaMediatorImpl();
        mediator.Transform(engine, template, package);
    }
}

Funny how powerful these proxies are -- my JavaMediatorImpl class (written in Java) actually implements the same IMediator (well, the proxied interface):

import tridion.contentmanager.communicationmanagement.Template;
import tridion.contentmanager.templating.Engine;
import tridion.contentmanager.templating.IMediator;
import tridion.contentmanager.templating.Package;
import tridion.contentmanager.templating.TemplatingLogger;

public class JavaMediatorImpl implements IMediator {

    private TemplatingLogger log = TemplatingLogger.GetLogger(Engine.typeof());

    public void Transform(Engine engine, Template template, Package _package) {
        log.Debug("Start Mediator.Transform");
        executeTemplate(engine, template, _package);
        log.Debug("Finish Transform");
    }
}

Dynamic Code Execution

In the example above, the implementation of executeTemplate(engine, template, _package) is not given. This is the beef of the actual code execution. This method is responsible for:
  • putting the Java Fragment in a Java source code class context;
  • compiling the Java code from a String to a .class file;
  • loading the .class dynamically and executing it, while passing the actual engine and _package proxy objects to it;

Put Java Fragment into Class Context

Since Java Fragment TBB is, well... a fragment, it needs to be put in some context in order to execute it. I implemented this context as a 'skeleton' of a class with a placeholder in the middle. This is where the Java Fragment comes in.

My skeleton Java class is initially a String and it looks something like this:

private static final String JAVA_FRAGMENT_TBB_PATTERN =
    "package mediator;\r\n" +
    "\r\n" +
    "import tridion.contentmanager.communicationmanagement.*;\r\n" +
    "import tridion.contentmanager.contentmanagement.*;\r\n" +
    "import tridion.contentmanager.templating.*;\r\n" +
    "\r\n" +
    "public class JavaFragmentTBB {\r\n" +
    "\r\n" +
    "    private TemplatingLogger log = TemplatingLogger.GetLogger(null);\r\n" +
    "\r\n" +
    "    public void execute(Engine _engine, tridion.contentmanager.templating.Package _package) {\r\n" +
    "        Engine engine = _engine;\r\n" +
    "        Engine _e = _engine;\r\n" +
    "        tridion.contentmanager.templating.Package _p = _package;\r\n" +
    "%s\r\n" +
    "    }\r\n" +
    "}";

The placeholder %s gets formatted (replaced) with the actual Java Fragment template:

String javaSource = String.format(JAVA_FRAGMENT_TBB_PATTERN, template.getContent());
SourceStringCompiler compiler = new SourceStringCompiler("mediator.JavaFragmentTBB", javaSource);

Compile .class File Dynamically

The SourceStringCompiler mentioned above is a class that takes a className and its Java source code as parameters and compiles it into a .class file on the file system.

I made use of the Java Compiler API in javax.tools for that.

JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
StandardJavaFileManager fileManager = compiler.getStandardFileManager(null, null, null);
// collects compiler errors/warnings so they can be reported back to the caller
DiagnosticCollector<JavaFileObject> diagnostics = new DiagnosticCollector<JavaFileObject>();

JavaFileObject javaObjectFromString = getJavaFileContentsAsString(className, javaSource);
Iterable<JavaFileObject> fileObjects = Arrays.asList(javaObjectFromString);
Iterable<String> options = Arrays.asList("-d", "C:\\Java Mediator\\Classes");

CompilationTask task = compiler.getTask(null, fileManager, diagnostics, options, null, fileObjects);
Boolean result = task.call();

if (result) {
    log.Debug("Compilation succeeded");
} else {
    String message = diagnostics.getDiagnostics().toString();
    log.Error("Compilation failed\r\n" + message);
    throw new RuntimeException("Compilation failed\r\n" + message);
}

I like the simplicity of the code above and the fact that I don't need to deal with creating my own javac execution and batch file. The result of running CompilationTask.call() is still that javac is called behind the scenes and that a .class file is generated on the file system.

If the compilation fails, I throw an exception to let the calling code (.NET) know that something was wrong with the actual execution of the Java Fragment. This exception is propagated all the way back into the rendering engine (and it will also show up in TemplateBuilder).

A peculiar thing about the execution performance: the first time, it takes a few seconds to instantiate the compiler, but every subsequent execution (within the same JVM) takes significantly less.

Execute .class Dynamically

Finally, I need to load the .class file dynamically and execute its execute() method with the Engine and Package parameters. Again, the beauty of these JNI proxies shows here -- I am able to pass them in as parameters to this dynamically generated, dynamically compiled class. Still, any modification made to the Engine and Package objects will be directly reflected back into the 'real' objects in .NET, since we are in fact dealing with proxies to the actual objects.

I used Reflection API to load the class dynamically:

File classesDir = new File("C:\\Java Mediator\\Classes");
ClassLoader parentLoader = JavaMediatorImpl.class.getClassLoader();
URLClassLoader myClassLoader = new URLClassLoader(new URL[] { classesDir.toURI().toURL() }, parentLoader);
Class<?> myClass = myClassLoader.loadClass("mediator.JavaFragmentTBB");

Instantiate the class, get the execute() method and call it with Engine and Package object parameters:

Object theInstance = myClass.newInstance();
Method myMethod = myClass.getMethod("execute", Engine.class, Package.class);
myMethod.invoke(theInstance, new Object[] { engine, _package });

It's Show Time, Dim the Lights!

I created a Java Fragment TBB as shown in the screenshot below. I placed it in a Page Template and then executed it on a Page in TemplateBuilder.
Note the usage of ContentType -- remember the ProxyGen limitation on Enums (or any other read-only, const fields)? There is no ContentType.Text available, hence I have to create my own. Behind the scenes, ContentType.Text is in fact a ContentType with MIME type 'text/plain'.
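
For illustration, a hedged sketch of the kind of code that could go into the fragment (this is not the code from the screenshot; ContentType's MIME-type constructor and the Package.CreateStringItem/PushItem calls are the proxied TOM.NET members, but treat the exact snippet as an assumption):

// Body of a Java Fragment TBB -- _package and log come from the skeleton shown earlier.
// ContentType.Text is not exposed by the generated proxies (no enums / const fields),
// so an equivalent ContentType is constructed from its MIME type.
ContentType textType = new ContentType("text/plain");
_package.PushItem("JavaGreeting", _package.CreateStringItem(textType, "Hello from a Java Fragment TBB"));

ContentType htmlType = new ContentType("text/html");
_package.PushItem("JavaMarkup", _package.CreateStringItem(htmlType, "<p>Rendered by Java</p>"));

log.Debug("Pushed 2 items into the Package");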

Running the Page Template with the Java Fragment in it in TemplateBuilder will yield the following output:
Note:
  • 2 items added to the Package and their mime-types;
  • Execution Time is consistently at 0.1sec on every subsequent run (first run was ~6sec). Part of the first-run execution time is the instantiation of the JVM inside jni4net, loading the JARs and Assemblies, etc. Interesting to notice, however, is that I didn't implement any caching -- i.e. the .class file is compiled every time the TBB executes; also, the new .class is loaded and executed with every single TBB execution;
  • Debug information showing in the Output panel;
Known (current) limitation: Page.ComponentPresentations returns IList<ComponentPresentation> and the proxies generated by Proxygen cannot deal with generics. Hence, calling page.getComponentPresentations() on the Java proxy throws a very cryptic InvocationTargetException.

Looking to the (Near) Future

Next steps of Research & Development include:
  • support for JAR binary TBBs (I heard some horror stories there) with the ability to run individual pre-compiled "ITemplate" Java classes inside of it;
  • JSP/JSTL replacement of the so passionately disliked Dreamweaver TBBs. The idea is to have a JSP-like TBB that accepts JSP/JSTL syntax in order to generate the "Output" package item;
  • Code handling for both JSP/JSTL and Java Fragment TBBs (code syntax validation, compilation warning/errors, source code formatting, etc...);
Overall, I'm excited by the potential of this solution. I see it as a very cool usage of technologies and it will definitely fill a niche in the market. I shall continue posting updates on this little R&D project. Looking forward to any comments, questions, remarks...

If you liked it so far, check out my next post about Java Fragment validation.


Syntax Validation for Java Fragment TBB

This little post explains how to perform "Content Validation" of a Java Fragment TBB as part of the Tridion Java Mediator I am currently working on. If you want more information about the "Yet Another Java Mediator" for Tridion templating, have a look at my previous post.

Content validation for TBBs, or in Tridion terms, a Template Content Handler, is a piece of code that runs when a TBB is read (opened) or written (saved) and it serves two purposes:
  • perform any validation of the TBB content;
  • replace any references to other Tridion items (either by TCM URI or WebDAV URL, or some kind of reference) with some kind of substitution;
I have not implemented any replacement logic yet (it is to come in the near future, when I'll write the logic to upload JARs and execute classes from them). So far, my goal was to compile the TBB Java source fragment at the moment the TBB is saved and, in case of errors, show them in the GUI and prevent saving. This is the same behaviour the C# Fragment TBBs have in OOTB Tridion.

In .NET things are simple -- all I had to do was create a class that extends abstract Tridion.ContentManager.Templating.AbstractTemplateContentHandler and implement method PerformValidateContent. From this method I called the Java Fragment compile logic (written in Java, over JNI proxy, as described in my previous post).

C#:
public class JavaFragmentContentHandler : AbstractTemplateContentHandler {

    public override void PerformValidateContent() {
        JavaTemplateHandler handler = new JavaTemplateHandler();
        handler.compile(base.Content);
    }
}

Java:
public class JavaTemplateHandler {

    public void compile(String javaFragmentSource) {
        String javaSource = String.format(JAVA_FRAGMENT_SKELETON_SOURCE, javaFragmentSource);
        SourceStringCompiler compiler = new SourceStringCompiler(CLASSES_DIR);
        compiler.compile(JAVA_FRAGMENT_SKELETON_CLASS_NAME, javaSource);
    }
}

On the Java side, I am simply intercepting the messages coming from the Compiler API and, in case of an error, throwing an exception with the compiler message formatted a bit. I only adjust the line numbers, so that the line number reported is relative to the Java Fragment shown in Tridion and not to the entire compilation unit. The thrown exception is propagated all the way back into .NET and eventually ends up being displayed in the CME.
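
A minimal sketch of that line-number formatting, assuming a DiagnosticCollector is handed to the compiler task (the helper class and the SKELETON_HEADER_LINES constant are hypothetical; the constant should hold the number of lines the skeleton adds before the %s placeholder):

import java.util.Locale;
import javax.tools.Diagnostic;
import javax.tools.DiagnosticCollector;
import javax.tools.JavaFileObject;

// Hypothetical helper: rewrites compiler diagnostics so line numbers are relative to the Java Fragment
public class FragmentDiagnosticsFormatter {

    // Lines the skeleton source contributes before the fragment (count them in JAVA_FRAGMENT_TBB_PATTERN)
    private static final int SKELETON_HEADER_LINES = 14;

    public String format(DiagnosticCollector<JavaFileObject> diagnostics) {
        StringBuilder message = new StringBuilder();
        for (Diagnostic<? extends JavaFileObject> diagnostic : diagnostics.getDiagnostics()) {
            long fragmentLine = diagnostic.getLineNumber() - SKELETON_HEADER_LINES;
            message.append(String.format("Line %d: %s%n", fragmentLine, diagnostic.getMessage(Locale.ENGLISH)));
        }
        return message.toString();
    }
}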

I hit a few roadblocks, of course:
  • Content handlers need to be registered in the GAC. This is unfortunately inconsistent with the mechanism for registering mediators, where you can specify the actual assembly path. As a Java head, I really dislike the GAC -- it is the root of all evil. This meant I had to sign all my referenced DLLs (including Tom.Java and jni4net), which in turn meant a lot of trouble with the JAR loading mechanism of jni4net, etc. In the end I opted for reflection, just to keep the 'referenced' DLLs out of the GAC and unsigned;
  • 32bit vs 64bit trouble -- some parts of the Tridion CM run as 32bit applications and some as 64bit. This poses a huge problem for the jni4net Bridge, which has to use the appropriate 32 vs 64 bit Java Development Kit based on the calling .NET CLR it runs in;
Finally, everything worked and the validation looks something like this:
Notice the variable "a" declared without a type and the Date class not being resolvable.

A detailed error message looks like this, if expanded:
So far I have the entire Java Fragment TBB done, including validation/compilation. Next, I'll focus on the JSP/JSTL TBBs that I intend to use as a replacement for Dreamweaver TBBs.


JSP Template Building Block

Continuing on my quest for creating a Java Mediator for Tridion templating, I have now reached the moment when I would focus on a JSP TBB. Check out my previous post about Java Fragment TBB and how to validate its syntax/compilation capability.

The JSP TBB is meant to provide the same functionality as a Dreamweaver TBB, only in Java/JSP technology. So I had in mind the following requirements:
  • JSP support (write the TBB in JSP syntax);
  • compile JSP to Java source file;
  • compile Java source to byte-code;
  • execute Java compiled class in the context of Engine and Package objects;
  • create "Output" package item containing the result of JSP execution;
The following is a sample of a JSP Layout TBB for a Page Template (a sketch follows the notes below).
Note:
  • The objects Engine and Package are available in the 'request';
  • There is no real 'request' object, as the entire JSP code above does not run in or require a J2EE container -- the request is a FakeHttpServletRequest, but more on that below...
  • The Page object is called _page, as there is already a "page" variable defined in the HttpJspBase of this JSP source class;
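
The original sample is a screenshot, so here is only a minimal sketch of what such a JSP Layout TBB could look like, assuming the request attributes described in the notes above (how _page is resolved, and the Package.GetValue accessor used, are illustrative assumptions on my part):

<%@ page import="tridion.contentmanager.templating.Engine,
                 tridion.contentmanager.communicationmanagement.Page" %>
<%
    // The mediator injects the proxied Engine and Package into the fake request as attributes
    Engine engine = (Engine) request.getAttribute("engine");
    tridion.contentmanager.templating.Package _package =
            (tridion.contentmanager.templating.Package) request.getAttribute("package");

    // "page" is reserved by HttpJspBase, hence the _page name
    Page _page = (Page) engine.GetObject(_package.GetValue("Page.ID"));
%>
<html>
  <body>
    <h1><%= _page.getTitle() %></h1>
  </body>
</html>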

General Approach

  • The JSP Layout TBB type is handled by the mediator class (.net) JspJstlMediator, as defined in the Tridion.ContentManager.config file;
  • The .NET mediator object calls the Java proxy JspJstlMediatorImpl.Transform(), which performs the actions:
    • create JSP file on file-system;
    • compile JSP to Java source;
    • compile Java source to Java Servlet .class;
    • execute the servlet while injecting the Engine and Package JNI proxy objects into the request as attributes;
  • Output package item is created and it contains the output generated by the execution of the servlet class;

JspJstlMediator.cs

public class JspJstlMediator : IMediator {

    public void Transform(Engine engine, Template template, Package package) {
        Utils.SetupJni4NetBridge();
        IMediator mediator = new JspJstlMediatorImpl();
        mediator.Transform(engine, template, package);
    }
}

The Utils.SetupJni4NetBridge() is a utility method that simply creates the Jni4Net Bridge, as described in my previous post Yet Another Java Mediator.

JspJstlMediatorImpl.java

This class simply dispatches to the template handler for JSP generation, compilation and execution.

public class JspJstlMediatorImpl implements IMediator {

    public void Transform(Engine engine, Template template, Package _package) {
        JspJstlTemplateHandler handler = new JspJstlTemplateHandler(template);
        handler.createJsp();
        handler.compileJsp();
        handler.compileJava();
        handler.execute(engine, _package);
    }
}

JspJstlTemplateHandler.java

This class implements the entire logic of handling a given Template. The constructors are shown below:

public JspJstlTemplateHandler(String name, String content, long lastModifiedTicks) {
    this.name = name;
    this.content = content;
    this.lastModifiedTicks = lastModifiedTicks;
    //... other initializations
}

public JspJstlTemplateHandler(Template template) {
    this(template.getTitle(),
        template.getContent(),
        template.getRevisionDate().ToUniversalTime().getTicks());
}

Create JSP File

The sole purpose of this method is to create the JSP file on the file system. So it takes the content of the Template and writes it into a file in a certain location on the file system. Several parts of this method are missing, e.g. exception handling or cache handling based on the Template LastRevisionDate and the file LastModified date.

public void createJsp() {
    try {
        BufferedWriter out = new BufferedWriter(new FileWriter(jspFileName));
        out.write(content);
        out.close();
    } catch (IOException ioe) {
        // handle exception
    }
}

Compile JSP to Java Source File

The biggest challenge for me was to find a way to generate the Java Servlet source code representing the JSP itself and then execute it outside a J2EE container! I could have opted for the presence of a Tomcat instance on the CM server and just dropped the JSP in one of its web-application roots. I thought, however, that that would impose some overhead I didn't want to deal with (e.g. installing Tomcat, configuring a web application).

Instead I opted for an out-of-container JSP compilation and JSP/Java Servlet execution. I will explain the execution part further down in this post. Let's focus now on the JSP to Java source 'compilation'.

I used Jasper2 from Tomcat 6 for converting JSP files to Java sources. There is also a shell class for the Jasper compiler, org.apache.jasper.JspC, that can be used to pre-compile JSP files. This class is in fact a Java application, so it can run from the command line, thus outside of a J2EE container -- exactly what I wanted.

Using JspC is in fact very simple due to its versatile parameters that can be passed to it. In my case, the sample for generating JSP to Java source is the following:

public void compileJsp() {
    JspC jspc = new JspC();
    try {
        String[] jspcArgs = new String[] { "-uriroot", jspDir, "-d", jspSrcDir, jspFileName };
        jspc.setArgs(jspcArgs);
        jspc.execute();
    } catch (Exception e) {
        // handle exception
    }
}

Notice the three parameters I'm sending to JspC:
  • -uriroot - the directory of the root of my web application (I don't have one, so I just set the parent folder where the JSP resides; I also don't have a web.xml or a WEB-INF folder);
  • -d - the directory to generate the Java sources under;
  • the actual JSP file location to 'compile';

Compile Java Source to Byte-Code

JspC can also handle the actual compilation of the generated Java source into a byte-code .class file by simply supplying a -compile argument. I chose however not to use this option due to its overhead in requiring several additional JAR files (e.g. ant.jar, ant-launcher.jar, etc).

Instead I opted for using the out-of-the-box Java Compiler API that I had already used for the Java Fragment TBBs. The code is very similar to the previous one, only the file to compile comes from a real File on the file system and not from a String:

public void compileJava() {
    File javaFile = new File(javaFileName);
    try {
        SourceStringCompiler compiler = new SourceStringCompiler(jspSrcDir);
        compiler.compile(javaFile);
    } catch (RuntimeException re) {
        // handle exception
    }
}

Execute JSP Outside the J2EE Container

Executing the .class file follows the same methodology described in my earlier post -- i.e. reflection. However the twist here is that I'm in fact executing the service(HttpServletRequest, HttpServletResponse) method of a Java Servlet that represents my JSP.

As I said previously, I don't have an Application Server to execute my JSP in. Therefore, I had to fake all the objects that I would normally receive from a J2EE container -- e.g. request, response, servlet context and config, etc.

Once I had all the necessary objects, I was able to invoke the service() method using reflection and it would execute the Servlet, as if it were executing inside a J2EE container. Of course there are some things that simply don't exist and I excluded them from the 'fake' classes. For example, the 'application' or 'session' scope, session itself, many parameters of the request and response objects, etc.

public void execute(Engine engine, Package _package) {
    try {
        File jspSrcDirFile = new File(jspSrcDir);
        ClassLoader parentLoader = JspJstlTemplateHandler.class.getClassLoader();

        URLClassLoader classLoader = new URLClassLoader(
                new URL[] { jspSrcDirFile.toURI().toURL() },
                parentLoader);
        Class<?> jspJavaClass = classLoader.loadClass("org.apache.jsp." + javaName + "_jsp");
        HttpJspBase jspPage = (HttpJspBase) jspJavaClass.newInstance();

        StringWriter outputWriter = new StringWriter();
        FakeJspFactory jspFactory = new FakeJspFactory(outputWriter);
        JspFactory.setDefaultFactory(jspFactory);

        HttpServletRequest request = new FakeHttpServletRequest();
        HttpServletResponse response = new FakeHttpServletResponse();

        request.setAttribute("engine", engine);
        request.setAttribute("package", _package);

        // FakeServletConfig is a singleton (see its getInstance() below)
        jspPage.init(FakeServletConfig.getInstance());
        jspPage.service(request, response);

        _package.PushItem("Output", _package.CreateHtmlItem(outputWriter.toString()));
    } catch (Exception e) {
        // handle exception
    }
}

Note:
  • FakeJspFactory that takes as argument a StringWriter. This is where the JSP execution output will be written to, hence I'm using this writer to create the Output package item holding the result of this TBB's execution;
  • Fake request/response objects needed to pass to the service() method;
  • FakeServletConfig that initializes the JSP Servlet -- needed for things like Expression Evaluator (in EL), tag handling, etc. (but more about tag support in a following post);

The Fake Stuff

In order to execute a Java Servlet without a J2EE container, I have to provide fake implementations of the classes that the servlet directly uses and that the container would normally provide. Uncle Bob gives a great explanation of these 'mock' implementation classes.

I implemented the following 'fake' classes:

FakeJspFactory.java

This class serves two purposes:
  • defines a constructor that takes a StringWriter that will contain the entire execution output;
  • creates a FakePageContext that represents the entry point into the entire 'fake' world mimicking the J2EE container;
public class FakeJspFactory extends JspFactory {

    private StringWriter outputWriter;

    public FakeJspFactory(StringWriter outputWriter) {
        this.outputWriter = outputWriter;
    }

    public PageContext getPageContext(Servlet servlet, ServletRequest servletRequest, ServletResponse servletResponse,
            String string, boolean b, int i, boolean b1) {
        FakeJspWriter jspWriter = new FakeJspWriter(outputWriter, i, b1);
        return new FakePageContext(jspWriter, (HttpServletRequest) servletRequest,
                (HttpServletResponse) servletResponse);
    }
}

FakePageContext.java

This is the main class that provides all other 'fake' objects to the calling JSP Servlet.

public class FakePageContext extends PageContext {

    private transient HashMap<String, Object> attributes;
    private final JspWriter out;
    private HttpServletRequest request;
    private HttpServletResponse response;
    private ServletContext servletContext;
    private JspApplicationContextImpl applicationContext;
    private ELContextImpl elContext;

    public FakePageContext(JspWriter out, HttpServletRequest request, HttpServletResponse response) {
        this.out = out;
        this.request = request;
        this.response = response;
        attributes = new HashMap<String, Object>(16);
    }

    public JspWriter getOut() {
        return out;
    }

    public ServletRequest getRequest() {
        return request;
    }

    public ServletResponse getResponse() {
        return response;
    }

    public ServletContext getServletContext() {
        if (servletContext == null) {
            servletContext = new FakeServletContext();
        }
        return servletContext;
    }

    public Object findAttribute(String name) {
        Object result = getAttribute(name);
        if (result == null) {
            result = getAttribute(name, REQUEST_SCOPE);
        }
        return result;
    }

    public Object getAttribute(String name) {
        return attributes.get(name);
    }

    public Object getAttribute(String name, int scope) {
        switch (scope) {
            case PAGE_SCOPE:
                return attributes.get(name);

            case REQUEST_SCOPE:
                return request.getAttribute(name);

            default:
                throw new IllegalArgumentException("Invalid scope");
        }
    }

    public int getAttributesScope(String name) {
        if (getAttribute(name) != null) {
            return PAGE_SCOPE;
        }
        if (getAttribute(name, REQUEST_SCOPE) != null) {
            return REQUEST_SCOPE;
        }
        return 0;
    }

    public void removeAttribute(String name) {
        removeAttribute(name, PAGE_SCOPE);
        removeAttribute(name, REQUEST_SCOPE);
    }

    public void removeAttribute(String name, int scope) {
        switch (scope) {
            case PAGE_SCOPE:
                attributes.remove(name);
                break;

            case REQUEST_SCOPE:
                request.removeAttribute(name);
                break;

            default:
                throw new IllegalArgumentException("Invalid scope");
        }
    }

    public void setAttribute(String name, Object attribute) {
        if (attribute != null) {
            attributes.put(name, attribute);
        } else {
            attributes.remove(name);
        }
    }

    public void setAttribute(String name, Object o, int scope) {
        if (o != null) {
            switch (scope) {
                case PAGE_SCOPE:
                    attributes.put(name, o);
                    break;

                case REQUEST_SCOPE:
                    request.setAttribute(name, o);
                    break;

                default:
                    throw new IllegalArgumentException("Invalid scope");
            }
        } else {
            removeAttribute(name, scope);
        }
    }
}
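
To see the scope-resolution logic above in action, here is a small JUnit-style sketch (my own, not part of the original post) that exercises the attribute fallback from page scope to request scope:

import static org.junit.Assert.assertEquals;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.jsp.PageContext;

import org.junit.Test;

// sketch only: exercises the FakePageContext scope handling shown above
public class FakePageContextTest {

    @Test
    public void findAttributeFallsBackToRequestScope() {
        HttpServletRequest request = new FakeHttpServletRequest();
        PageContext pageContext = new FakePageContext(null, request, new FakeHttpServletResponse());

        // only a request-scoped attribute exists, so findAttribute falls back to it
        request.setAttribute("title", "from request");
        assertEquals("from request", pageContext.findAttribute("title"));

        // once a page-scoped attribute with the same name is set, it takes precedence
        pageContext.setAttribute("title", "from page");
        assertEquals("from page", pageContext.findAttribute("title"));
        assertEquals(PageContext.PAGE_SCOPE, pageContext.getAttributesScope("title"));
    }
}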

FakeServletConfig.java

public class FakeServletConfig implements ServletConfig {

    private static ServletConfig instance;
    private ServletContext servletContext;

    private FakeServletConfig(ServletContext servletContext) {
        this.servletContext = servletContext;
    }

    public static ServletConfig getInstance() {
        if (instance == null) {
            instance = new FakeServletConfig(FakeServletContext.getInstance());
        }
        return instance;
    }

    public ServletContext getServletContext() {
        return servletContext;
    }
}

FakeServletContext.java

public class FakeServletContext implements ServletContext {

    private static ServletContext instance;
    private transient HashMap<String, Object> attributes = new HashMap<String, Object>(16);

    public static ServletContext getInstance() {
        if (instance == null) {
            instance = new FakeServletContext();
        }
        return instance;
    }

    public Object getAttribute(String name) {
        return attributes.get(name);
    }

    public void setAttribute(String name, Object value) {
        attributes.put(name, value);
    }

    public void removeAttribute(String name) {
        attributes.remove(name);
    }

    public ServletContext getContext(String s) {
        return instance;
    }
}

FakeHttpServletRequest.java

public class FakeHttpServletRequest implements HttpServletRequest {

    private Map<String, String> parameters = new TreeMap<String, String>();
    private Map<String, Object> attributes = new TreeMap<String, Object>();

    public void setAttribute(String name, Object attribute) {
        attributes.put(name, attribute);
    }

    public Object getAttribute(String name) {
        return attributes.get(name);
    }

    public void removeAttribute(String name) {
        attributes.remove(name);
    }

    public void setParameter(String name, String value) {
        parameters.put(name, value);
    }

    public String getParameter(String name) {
        return parameters.get(name);
    }

    public Map getParameterMap() {
        return parameters;
    }
}

FakeHttpServletResponse.java

All methods in this class have default empty implementation.
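
The class itself is not listed in the post; assuming it follows the same pattern as the other fakes, it might look something like the sketch below. The choice of methods shown is illustrative -- the real class simply leaves everything empty.

import java.io.PrintWriter;
import java.io.StringWriter;
import javax.servlet.http.HttpServletResponse;

// sketch only: every HttpServletResponse method is a no-op or returns a harmless default
public class FakeHttpServletResponse implements HttpServletResponse {

    public void setContentType(String type) {
        // no-op: the JSP output is captured by FakeJspWriter, not by the response
    }

    public void setStatus(int status) {
        // no-op
    }

    public PrintWriter getWriter() {
        // anything written directly to the response is discarded
        return new PrintWriter(new StringWriter());
    }

    // remaining HttpServletResponse methods omitted -- they are likewise empty
    // or return default values
}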

Next Steps

  • Implement a Template Content Handler for JSP Layout TBBs. This would compile the JSP code and display any compilation errors in the Message Center of the Tridion CME;
  • Implement JSTL support, such that JSTL tags and Expression Language constructs work in JSP TBBs - e.g. <c:out value="${pageTitle}"/>;
  • Tridion Object Model Bean-ification, so that one can navigate the TOM using EL such as ${component.fields.paragraph[0].bodytext};

