Q & A. IV

This question reached me through LinkedIn.

I saw you in the EMC community and wanted your advice on xCP 2.2 workflows. We have xCP 1.6 workflows and are currently planning to re-design them for xCP 2.2, and we have the queries below:

1) We have a custom document object type called cus_enterprise which inherits from dm_document. We are using this type as a package in the current xCP 1.6 workflow. Can we use the same object type in the xCP 2.2 environment (i.e. adopt cus_enterprise) for workflows?

2) Will there be any conflicts/issues for existing workflows in the xCP 1.6 environment if we adopt the custom type in xCP Designer and install the application in the xCP 2.2 environment? Also, we want to run both xCP 1.6 and xCP 2.2 workflows in parallel, with the same data model used as packages. Will this have any impact?

3) We heard that we cannot use the same data model in the xCP 1.6 and xCP 2.2 environments, and that we need to create a new data model for workflows and then use it in xCP 2.2 workflows. Is that correct?

Sukumar, first of all, you seem to be a very good EMC customer – you already bought a stillborn product (xCP), and now you are taking part in beta-testing (xCP2), so in my opinion EMC support must make every effort to help you with your project 🙂

Anyway, the type adoption feature does work (we were able to make xCP2 work with Composer types and webtop work with xCP2 workflows), but there are some glitches you should know about:

  • xCP Designer behaves very strangely with adopted types: removing an adopted type from a project and then installing that project removes the adopted type from the repository – be careful with this “feature” (yes, EMC considers this bug a feature request)
  • interoperability between xCP2 and other applications, which EMC promised in xCP2.1, was nothing but a failure
  • yes, xCP2 workflows use another model (no alias sets, no custom methods, process variables are stored differently). Technically this means that you should either not work with workflows through the old application (for example, xCP2 displays tasks only for its own workflows) or implement a custom adapter. For example, below is our utility class that allows working with different models of process variables (it assumes there is some naming convention for process variables):
    /**
     * @author Andrey B. Panfilov <andrew@panfilov.tel>
     */
    public class ProcessVariableUtils {
    
        public ProcessVariableUtils() {
            super();
        }
    
        public static Object getValue(IDfWorkitemEx workitem, String variableName)
            throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                return getProcessVariableValueInternal(workitem, variableName);
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    return getProcessVariableValueInternal(workitem, candidate);
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            return workitem.getStructuredDataTypeAttrValue(setting, parameter);
        }
    
        private static Object getProcessVariableValueInternal(
                IDfWorkitemEx workitem, String variableName) throws DfException {
            return getNonProxied(workitem).getPrimitiveVariableValue(variableName);
        }
    
        public static Object getValue(IDfWorkflowEx workflow, String variableName)
            throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                return getProcessVariableValueInternal(workflow, variableName);
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    return getProcessVariableValueInternal(workflow, candidate);
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            return workflow.getStructuredDataTypeAttrValue(setting, parameter);
        }
    
        private static Object getProcessVariableValueInternal(
                IDfWorkflowEx workflow, String variableName) throws DfException {
            return getNonProxied(workflow).getPrimitiveVariableValue(variableName);
        }
    
        public static void setValue(IDfWorkitemEx workitem, String variableName,
                Object value) throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                setProcessVariableValueInternal(workitem, variableName, value);
                return;
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    setProcessVariableValueInternal(workitem, candidate, value);
                    return;
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            workitem.setStructuredDataTypeAttrValue(setting, parameter, value);
        }
    
        private static void setProcessVariableValueInternal(IDfWorkitemEx workitem,
                String variableName, Object value) throws DfException {
            getNonProxied(workitem).setPrimitiveObjectValue(variableName, value);
        }
    
        public static void setValue(IDfWorkflowEx workflow, String variableName,
                Object value) throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                setProcessVariableValueInternal(workflow, variableName, value);
                return;
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    setProcessVariableValueInternal(workflow, candidate, value);
                    return;
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            workflow.setStructuredDataTypeAttrValue(setting, parameter, value);
        }
    
        private static void setProcessVariableValueInternal(IDfWorkflowEx workflow,
                String variableName, Object value) throws DfException {
            getNonProxied(workflow).setPrimitiveObjectValue(variableName, value);
        }
    
        public static Object[] getValues(IDfWorkitemEx workitem, String variableName)
            throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                return getProcessVariableValuesInternal(workitem, variableName);
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    return getProcessVariableValuesInternal(workitem, candidate);
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            return workitem.getStructuredDataTypeAttrValues(setting, parameter);
        }
    
        private static Object[] getProcessVariableValuesInternal(
                IDfWorkitemEx workitem, String variableName) throws DfException {
            try {
                workitem = getNonProxied(workitem);
                Method method = ReflectionUtils.getMethod(workitem.getClass(),
                        "getRepeatingPrimitiveVariableValues", String.class);
                if (method != null) {
                    List result = (List) method.invoke(workitem, variableName);
                    if (result == null) {
                        return new Object[] {};
                    }
                    return result.toArray(new Object[result.size()]);
                }
                throw new ProcessVariableNotFoundException(
                        "getRepeatingPrimitiveVariableValues", variableName);
            } catch (IllegalAccessException ex) {
                throw new RuntimeException(ex);
            } catch (InvocationTargetException ex) {
                Throwable t = ex.getTargetException();
                if (t instanceof DfException) {
                    throw (DfException) t;
                }
                throw new RuntimeException(ex);
            }
        }
    
        public static Object[] getValues(IDfWorkflowEx workflow, String variableName)
            throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                return getProcessVariableValuesInternal(workflow, variableName);
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    return getProcessVariableValuesInternal(workflow, candidate);
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            return workflow.getStructuredDataTypeAttrValues(setting, parameter);
        }
    
        private static Object[] getProcessVariableValuesInternal(
                IDfWorkflowEx workflow, String variableName) throws DfException {
            try {
                workflow = getNonProxied(workflow);
                Method method = ReflectionUtils.getMethod(workflow.getClass(),
                        "getRepeatingPrimitiveVariableValues", String.class);
                if (method != null) {
                    List result = (List) method.invoke(workflow, variableName);
                    if (result == null) {
                        return new Object[] {};
                    }
                    return result.toArray(new Object[result.size()]);
                }
                throw new ProcessVariableNotFoundException(
                        "getRepeatingPrimitiveVariableValues", variableName);
            } catch (IllegalAccessException ex) {
                throw new RuntimeException(ex);
            } catch (InvocationTargetException ex) {
                Throwable t = ex.getTargetException();
                if (t instanceof DfException) {
                    throw (DfException) t;
                }
                throw new RuntimeException(ex);
            }
        }
    
        public static void setValues(IDfWorkitemEx workitem, String variableName,
                Object[] values) throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                setProcessVariableValuesInternal(workitem, variableName, values);
                return;
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    setProcessVariableValuesInternal(workitem, candidate, values);
                    return;
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            workitem.setStructuredDataTypeAttrValues(setting, parameter, values);
        }
    
        private static void setProcessVariableValuesInternal(
                IDfWorkitemEx workitem, String variableName, Object[] values)
            throws DfException {
            try {
                workitem = getNonProxied(workitem);
                Method method = ReflectionUtils.getMethod(workitem.getClass(),
                        "setRepeatingPrimitiveObjectValues", String.class,
                        List.class);
                if (method != null) {
                    method.invoke(workitem, variableName, Arrays.asList(values));
                    return;
                }
                throw new ProcessVariableNotFoundException(
                        "setRepeatingPrimitiveObjectValues", variableName);
            } catch (IllegalAccessException ex) {
                throw new RuntimeException(ex);
            } catch (InvocationTargetException ex) {
                Throwable t = ex.getTargetException();
                if (t instanceof DfException) {
                    throw (DfException) t;
                }
                throw new RuntimeException(ex);
            }
        }
    
        public static void setValues(IDfWorkflowEx workflow, String variableName,
                Object[] values) throws DfException {
            int dotIndex = variableName.indexOf(".");
            if (dotIndex < 0) {
                setProcessVariableValuesInternal(workflow, variableName, values);
                return;
            }
            for (String candidate : structuredToPrimitive(variableName)) {
                try {
                    setProcessVariableValuesInternal(workflow, candidate, values);
                    return;
                } catch (ProcessVariableNotFoundException ex) {
                    // pass
                } catch (ProcessVariableTypeNullIdException ex) {
                    // pass
                }
            }
            String setting = variableName.substring(0, dotIndex);
            String parameter = variableName.substring(dotIndex + 1,
                    variableName.length());
            workflow.setStructuredDataTypeAttrValues(setting, parameter, values);
        }
    
        private static void setProcessVariableValuesInternal(
                IDfWorkflowEx workflow, String variableName, Object[] values)
            throws DfException {
            try {
                workflow = getNonProxied(workflow);
                Method method = ReflectionUtils.getMethod(workflow.getClass(),
                        "setRepeatingPrimitiveObjectValues", String.class,
                        List.class);
                if (method != null) {
                    method.invoke(workflow, variableName, Arrays.asList(values));
                    return;
                }
                throw new ProcessVariableNotFoundException(
                        "setRepeatingPrimitiveObjectValues", variableName);
            } catch (IllegalAccessException ex) {
                throw new RuntimeException(ex);
            } catch (InvocationTargetException ex) {
                Throwable t = ex.getTargetException();
                if (t instanceof DfException) {
                    throw (DfException) t;
                }
                throw new RuntimeException(ex);
            }
        }
    
        private static List<String> structuredToPrimitive(String variableName) {
            List<String> candidates = new ArrayList<String>(2);
            candidates.add(variableName.replace('.', '_').replace('/', '_'));
            candidates.add(variableName.replace('.', '_').replace('/', '_')
                    .toLowerCase());
            return candidates;
        }
    
        private static IDfWorkitemEx getNonProxied(IDfWorkitemEx workitemEx) {
            if (workitemEx instanceof IDfWorkitemExWrapper) {
                workitemEx = (IDfWorkitemEx) ((IDfWorkitemExWrapper) workitemEx)
                        .getWrappedObject();
            }
            if (workitemEx instanceof IProxyHandler) {
                return (IDfWorkitemEx) ((IProxyHandler) workitemEx)
                        .____getImp____();
            }
            return workitemEx;
        }
    
        private static IDfWorkflowEx getNonProxied(IDfWorkflowEx workflowEx) {
            if (workflowEx instanceof IProxyHandler) {
                return (IDfWorkflowEx) ((IProxyHandler) workflowEx)
                        .____getImp____();
            }
            return workflowEx;
        }
    
    }
    
    
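To make the assumed naming convention concrete: structuredToPrimitive() above maps a structured name such as invoice.Amount (a hypothetical variable name, not from any real project) to primitive candidates by replacing the separators with underscores and also trying a lower-cased variant. A minimal, DFC-free sketch:

```java
public class NamingConventionDemo {

    // mirrors structuredToPrimitive(): '.' and '/' become '_',
    // and a lower-cased variant is tried as a fallback
    static String[] candidates(String variableName) {
        String flat = variableName.replace('.', '_').replace('/', '_');
        return new String[] { flat, flat.toLowerCase() };
    }

    public static void main(String[] args) {
        // "invoice.Amount" is a hypothetical structured variable name
        for (String candidate : candidates("invoice.Amount")) {
            System.out.println(candidate);
        }
        // prints invoice_Amount, then invoice_amount
    }
}
```

The utility tries each candidate as a primitive process variable first and falls back to getStructuredDataTypeAttrValue() only when none resolves.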

Q&A. III

Hello, I very much love your posts. I need your suggestion on a current problem in my project.

We have a J2EE web application which uses the DFC API to upload/download documents. Currently, we create a single session manager object for the entire application and use it whenever we create a session, but when multiple users try to upload documents at the same time we cannot create sessions instantly: all threads are blocked, each waiting to get a connection.

I’m planning to create a session manager for each thread and create sessions from that per-thread session manager. Is this a correct approach? Please let us know if there is a better approach to resolve this issue.

thank you
Reddy.

Sankar, though your question relates to the final post (still unpublished 😦 ) of the “Session management” series, I will share my ideas in this post. When thinking about how to create sessions to Content Server, we should consider the following problems:

  1. Session creation takes a relatively long time
  2. Sharing sessions among threads causes performance degradation
  3. A lot of open sessions could overload Content Server

and there is no common approach that solves all three problems simultaneously – everything depends on the design of your application. For example, I think the code below is the best option you can use without binding to the application level:

IDfSession session = null;
IDfSessionManager sessionManager = null;
try {
    // creating brand new session manager
    sessionManager = _dfClient.newSessionManager();
    IDfLoginInfo loginInfo = new DfLoginInfo(_userName, _password);
    // preventing DFC from sending useless LOGIN RPC
    loginInfo.setForceAuthentication(false);
    sessionManager.setIdentity(_docbase, loginInfo);
    // creating brand new session
    session = sessionManager.getSession(_docbase);
 
    // some logic
 
} finally {
    if (session != null) {
        sessionManager.release(session);
        // forcibly returning session to session pool
        sessionManager.flushSessions();
    }
}

But in your case you are working at the application level, so the possible approaches vary; for example, you can leverage ThreadLocal capabilities to make some things more optimal, like:

ThreadLocal storage:

/**
 * @author Andrey B. Panfilov <andrew@panfilov.tel>
 */
public final class ThreadLocalStorage {

    private static final ThreadLocalStorage INSTANCE = new ThreadLocalStorage();

    private final ThreadLocal<Map<Object, Object>> _cache = new ThreadLocal<Map<Object, Object>>();

    private volatile boolean _active;

    private ThreadLocalStorage() {
        super();
    }

    public static boolean isActive() {
        return getInstance()._active;
    }

    public static void setActive(boolean active) {
        getInstance()._active = active;
    }

    private static ThreadLocalStorage getInstance() {
        return INSTANCE;
    }

    public static void clear() {
        Map cache = getCache(false);
        if (cache == null) {
            return;
        }
        for (Object value : cache.values()) {
            closeSilently(value);
        }
        cache.clear();
    }

    public static Object get(Object key) {
        Map cache = getCache(false);
        if (cache == null) {
            return null;
        }
        return cache.get(key);
    }

    public static void put(Object key, Object object) {
        Map<Object, Object> cache = getCache();
        cache.put(key, object);
    }

    public static void remove(Object key) {
        Map<Object, Object> cache = getCache();
        closeSilently(cache.remove(key));
    }

    public static boolean containsKey(String key) {
        Map<Object, Object> cache = getCache();
        return cache.containsKey(key);
    }

    private static Map<Object, Object> getCache() {
        return getCache(true);
    }

    private static Map<Object, Object> getCache(boolean create) {
        Map<Object, Object> cache = getInstance()._cache.get();
        if (create && cache == null) {
            cache = new HashMap<Object, Object>();
            getInstance()._cache.set(cache);
        }
        return cache;
    }

    private static void closeSilently(Object closeable) {
        try {
            if (!(closeable instanceof Closeable)) {
                return;
            }
            ((Closeable) closeable).close();
        } catch (IOException ex) {
            // ignore
        }
    }

}

Web filter, which maintains ThreadLocalStorage:

/**
 * @author Andrey B. Panfilov <andrew@panfilov.tel>
 */
public class ThreadLocalCacheFilter implements Filter {

    public ThreadLocalCacheFilter() {
        super();
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        try {
            chain.doFilter(request, response);
        } finally {
            ThreadLocalStorage.clear();
        }
    }

    @Override
    public void destroy() {
        ThreadLocalStorage.setActive(false);
    }

    @Override
    public void init(FilterConfig filterConfig) throws ServletException {
        ThreadLocalStorage.setActive(true);
    }

}

SessionManager helper:

/**
 * @author Andrey B. Panfilov <andrew@panfilov.tel>
 */
public final class SessionManagerHelper {

    private SessionManagerHelper() {
        super();
    }

    public static IDfSession getSession(String docbase) throws DfException {
        return getSessionManager(docbase).getSession(docbase);
    }

    public static IDfSession newSession(String docbase) throws DfException {
        return getSessionManager(docbase).newSession(docbase);
    }

    public static void release(IDfSession session) throws DfException {
        getSessionManager(session.getDocbaseName()).release(session);
    }

    private static IDfSessionManager getSessionManager(String docbase)
        throws DfException {
        CloseableSessionManager c = (CloseableSessionManager) ThreadLocalStorage
                .get(SessionManagerHelper.class);
        if (c == null) {
            IDfSessionManager sessionManager = new DfClientX().getLocalClient()
                    .newSessionManager();
            c = new CloseableSessionManager(sessionManager);
            ThreadLocalStorage.put(SessionManagerHelper.class, c);
        }

        IDfSessionManager sessionManager = c._sessionManager;

        if (!sessionManager.hasIdentity(docbase)) {
            IDfLoginInfo loginInfo = new DfLoginInfo("user", "password");
            loginInfo.setForceAuthentication(false);
            sessionManager.setIdentity(docbase, loginInfo);
        }
        return sessionManager;
    }

    static class CloseableSessionManager implements Closeable {

        private final IDfSessionManager _sessionManager;

        public CloseableSessionManager(IDfSessionManager sessionManager) {
            _sessionManager = sessionManager;
        }

        @Override
        public void close() throws IOException {
            if (_sessionManager == null) {
                return;
            }
            _sessionManager.flushSessions();
        }

    }

}

And now your code might look like:

IDfSession session = SessionManagerHelper.getSession("docbase");
try {
    // business logic here
} finally {
    if (session != null) {
        SessionManagerHelper.release(session);
    }
}
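Stripped of DFC specifics, the essence of the approach above is a per-thread cache: each thread gets its own map, so no synchronization or cross-thread sharing is involved. A minimal sketch (the class and method names here are mine, not part of any API):

```java
import java.util.HashMap;
import java.util.Map;

public class PerThreadCache {

    // each thread sees its own map; initialValue() lazily creates it
    private static final ThreadLocal<Map<String, Object>> CACHE =
            new ThreadLocal<Map<String, Object>>() {
                @Override
                protected Map<String, Object> initialValue() {
                    return new HashMap<String, Object>();
                }
            };

    public static Object get(String key) {
        return CACHE.get().get(key);
    }

    public static void put(String key, Object value) {
        CACHE.get().put(key, value);
    }

    // call from a finally block (or a web filter, as above), otherwise
    // the map leaks when threads are pooled by the application server
    public static void clear() {
        CACHE.remove();
    }
}
```

The web filter in the listings above plays exactly the role of the clear() call here: application servers pool threads, so anything left in a ThreadLocal survives into the next request unless it is explicitly cleaned up.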

Parallelism challenge

In a recent project, after refactoring some improperly implemented functionality (don’t worry, that functionality was implemented by previous consultants :)), we realized that we needed to delete about 50 million sysobjects. The bad news is that Documentum performs deletes very slowly – in single-threaded operation it is capable of deleting about 50 sysobjects per second, i.e. in our case it would take about 12 days – not a good perspective 😦 The obvious solution was multithreading, but I prefer not to code in Java when it comes to administration routines. So, I implemented a very basic STDIN multiplexer in Bourne-again shell:

#!/bin/bash

# amount of processes to spawn
PROCS=$1

# command to spawn
shift
CMD=$@

# opened descriptors
declare -a DESCRIPTORS
# spawned pids
declare -a PROCESSES

# closes all descriptors opened previously
close() {
 for d in ${DESCRIPTORS[*]}; do
  eval "exec $d<&-"
 done
}

# waits all spawned processes
# wait call does not work because
# processes are spawned in subshell
waitall() {
 local pid
 while :; do
  for pid in $@; do
   shift
   kill -0 $pid 2>/dev/null
   if [ "x0" = x$? ]; then
    set -- $@ $pid
   fi
  done
  (("$#" > 0)) || break
  sleep 5
 done
}

# find process's children
findchild() {
 local pid=$1
 while read p pp; do
  if [ "x$pid" = "x$pp" ]; then
   echo $p
  fi
 done < <(ps -eo pid,ppid)
}


# terminates process's tree
killtree() {
 local pid=$1
 local sig=${2-TERM}
 kill -STOP $pid 2>/dev/null
 if [ "x0" = x$? ]; then
  for child in `findchild $pid`; do
   killtree $child $sig
  done
  kill -$sig $pid 2>/dev/null
  kill -CONT $pid 2>/dev/null
 fi
}

# closes all descriptors and
# terminates all spawned processes
abort() {
 local pid
 close
 for pid in ${PROCESSES[*]}; do
  killtree $pid TERM
  killtree $pid KILL
 done
}

# emergency exit: close all descriptors
# and terminate spawned processes
trap "abort" 0

for (( i=0; i<=$PROCS-1; i+=1 )); do
 # spawning command
 exec {FD}> >($CMD) || exit $?
 # storing pid of spawned process
 PROCESSES[$i]=$!
 # storing opened descriptor
 DESCRIPTORS[$i]=$FD
done

while read line; do
 i=$(((i + 1) % $PROCS))
 echo $line >&${DESCRIPTORS[i]}
done

# normal exit: close all descriptors
# and wait spawned processes
close
waitall ${PROCESSES[*]}

trap - 0

Now I’m able to perform deletes in parallel:

 ~$] parallel.sh 40 iapi docbase -Udmadmin -Ppassword -X \
> < delete_objects.api > delete_objects.log
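The delete_objects.api file fed to the multiplexer is just a plain list of IAPI commands, one destroy call per object. A sketch of generating it from a file of object ids (the file names here are hypothetical):

```shell
#!/bin/bash
# object_ids.txt: one r_object_id per line (e.g. produced by a DQL query)
# destroy,c,<id> is the IAPI call that deletes a sysobject
while read id; do
 echo "destroy,c,$id"
done < object_ids.txt > delete_objects.api
```

The multiplexer then distributes these lines round-robin across the spawned iapi processes, so each process deletes its own share of the objects.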

DFC vs Memory

The java.lang.String class in Java prior to version 7 has an extremely controversial implementation – every string is backed by a character array, and all strings produced from the same string via the substring() method share the original character array:

 ~]$ groovysh
Groovy Shell (2.4.1, JVM: 1.6.0_45)
Type ':help' or ':h' for help.
-----------------------------------------------------
groovy:000> s="teststring"
===> teststring
groovy:000> s.dump()
===> <java.lang.String@9d966f23 value=teststring offset=0 count=10 hash=-1651085533>
groovy:000> s1=s.substring(0,2)
===> te
groovy:000> s2=s.substring(2,2)
===>
groovy:000> s1.dump()
// value is the same as in the original string, but offset and count differ
===> <java.lang.String@e71 value=teststring offset=0 count=2 hash=3697>
groovy:000> s2.dump()
===> <java.lang.String@0 value=teststring offset=2 count=0 hash=0>

but:

// now we create a "brand new" string
groovy:000> s3=new String(s1)
===> te
groovy:000> s3.dump()
===> <java.lang.String@e71 value=te offset=0 count=2 hash=3697>

In practice this means that if you are going to keep strings in memory for a long period of time, it might be a good idea to create a “brand new” string to reduce memory usage. How does this relate to DFC? DFC has a really weird implementation – it creates “brand new” strings only for the values of string attributes:

groovy:000> import com.documentum.com.*
===> com.documentum.com.*
groovy:000> import com.documentum.fc.common.*
===> com.documentum.com.*, com.documentum.fc.common.*
groovy:000> li = new DfLoginInfo("dmadmin", "dmadmin")
===> DfLoginInfo{user=dmadmin, forceAuth=true}
groovy:000> s = new DfClientX().getLocalClient().newSession("ssc_dev", li)
===> com.documentum.fc.client.impl.session.StrongSessionHandle@56092666
groovy:000> d = s.getObjectByQualification("dm_server_config")
===> PROXY@221a5d08[DfSysObject@1c6250d2[....
groovy:000> d.getObjectId().getId().value.length
===> 2782
groovy:000> d.getObjectId().getId().value
===> 2
dm_server_config
3d01ffd780000102 0

OBJ dm_server_config 0 0 0 158
B S 2 A 7 ssc_dev
C S 2 A 16 dm_server_config
D S 2 A 0
E S 2 A 0
F R 2 0
........
groovy:000> d.getObjectName().value.length
===> 7
groovy:000> d.getObjectName().value
===> ssc_dev
groovy:000> d.getFolderId(0).getId().value.length
===> 2782
groovy:000> d.getString("r_object_id").value.length
===> 2782
groovy:000> d.getString("object_name").value.length
===> 7

So, if you are going to store ids in memory you can get really weird behaviour: you might expect a string holding an object identifier to consume just 72 bytes (24 bytes for the String object itself: 8 bytes of object header, 4 bytes for the char array reference, and 3×4=12 bytes for the three int fields (offset, count and hash); plus 48 bytes for the character array: 12 bytes of header plus 16×2=32 bytes for the sixteen characters), but in practice it consumes a lot more memory, because it keeps the whole original character array alive.
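A defensive pattern on such JVMs is therefore to copy any substring-derived id before caching it (the dump string below is a short stand-in for the real 2782-character server dump):

```java
import java.util.ArrayList;
import java.util.List;

public class IdCopyDemo {
    public static void main(String[] args) {
        // stand-in for a large backing string, e.g. a server config dump
        String dump = "3d01ffd780000102 plus a few thousand more characters...";
        List<String> ids = new ArrayList<String>();
        // on pre-Java-7 JVMs dump.substring(0, 16) would share dump's
        // char[]; new String(...) forces a compact 16-character copy
        ids.add(new String(dump.substring(0, 16)));
        System.out.println(ids.get(0)); // 3d01ffd780000102
    }
}
```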

Dynamic groups. Advances. Part V

Previously I wrote that to utilize the DfPrivilegedActionInRole capability in DFC you need either to modify java.policy or to create a special docbase module. Today, while implementing functionality for a project, I found that neither option was suitable for that project, so I decided to dig a bit deeper into the internals of the DfPrivilegedActionInRole/RoleRequestManager classes, and after a while I got the following alternative to DfPrivilegedActionInRole, which requires neither docbase modules nor modifying java.policy:

/**
 * @author Andrey B. Panfilov <andrew@panfilov.tel>
 */
public class PrivilegedActionInRole<T> implements PrivilegedAction<T> {

    private final PrivilegedAction<T> _action;

    private final DfRoleSpec _roleSpec;

    public PrivilegedActionInRole(DfRoleSpec roleSpec,
            PrivilegedAction<T> action) {
        _roleSpec = roleSpec;
        _action = action;
    }

    @Override
    public T run() {
        try {
            startPrivilegedRequest(_roleSpec);
            return _action.run();
        } finally {
            stopPrivilegedRequest(_roleSpec);
        }
    }

    public static DfRoleSpec startPrivilegedRequest(String groupName) {
        return startPrivilegedRequest(new DfRoleSpec(groupName));
    }

    public static DfRoleSpec startPrivilegedRequest(String groupName,
            String docbaseName) {
        return startPrivilegedRequest(new DfRoleSpec(groupName, docbaseName));
    }

    public static void stopPrivilegedRequest(DfRoleSpec roleSpec) {
        RoleRequestManager requestManager = RoleRequestManager.getInstance();
        requestManager.pop(roleSpec);
    }

    public static DfRoleSpec startPrivilegedRequest(DfRoleSpec roleSpec) {
        RoleRequestManager requestManager = RoleRequestManager.getInstance();
        requestManager.push(roleSpec, new AccessControlContext(
                new ProtectionDomain[] {}));
        return roleSpec;
    }

}