AdoptOS

Assistance with Open Source adoption

ECM

Inject any custom class or service into web content templates

Liferay - Tue, 04/15/2014 - 16:01
Problem situation

Web content templates are easy to write in Liferay, but as they become more complex they tend to contain a lot of scripting. Complex Velocity or FreeMarker scripts are hard to maintain and even harder to debug, and unit testing them is impossible.

Macros

A first possible optimization is to use macros. Generic script blocks can be captured into a macro so the same logic can be used multiple times in your script. You can even reuse macro definitions across multiple templates by defining a generic macro template and including (parsing) that template into other templates that want to use those macros.
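For illustration, a shared macro template and a consumer might look like this (the file and macro names are hypothetical):

```velocity
## macros.vm - generic macro definitions, shared across templates
#macro (priceLabel $amount)
<span class="price">&euro; $amount</span>
#end

## product-template.vm - parses the shared file, then calls the macro
#parse("macros.vm")
#priceLabel(19.95)
```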

However, macros are not that flexible either. Macro calls are essentially expanded at parse time, so macros cannot really be used as functions. Maintaining the scripts remains difficult, and you still don't get to unit test your macro scripts. For complex functionality, writing Java code is clearly the preferred approach.

Velocity tools

Ray Augé of Liferay proposed a way to write custom Velocity utilities in his blog post Custom Velocity Tools. The approach was to create an empty hook project with a proper configuration and to create interface/implementation pairs for each tool you wanted to expose. Those tools could then be referenced with the $utilLocator variable in Velocity or Freemarker templates:

my-template.vm

```velocity
#set ($sayHelloTool = $utilLocator.findUtil("velocity-hook", "be.aca.literay.tool.SayHelloTool"))

$sayHelloTool.sayHello("Peter")
```

However, there are some limitations with this approach. At deploy time, all Velocity tools are put into a custom context, in which only these tools are accessible. There is no way to use normal beans, loaded by an application context, as Velocity tools. Ideally, when using Spring, you want to scan the classpath of your hook for beans and inject those beans into the Velocity context. This is not possible with Velocity tools.

Example

Suppose we have the following service, defined as a Spring component (@Service could be @Named as well, if you prefer using CDI):

SayHelloService.java

```java
@Service
public class SayHelloService {

    public String sayHello(String name) {
        return String.format("Hello, %s!", name);
    }
}
```
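Because the service is plain Java once you ignore the annotation, it can be unit tested directly, which is exactly the maintainability win we are after. A minimal sketch (the class name is hypothetical; it mirrors the method above):

```java
public class SayHelloCheck {

    // Mirrors SayHelloService.sayHello without the Spring wiring.
    static String sayHello(String name) {
        return String.format("Hello, %s!", name);
    }

    public static void main(String[] args) {
        System.out.println(sayHello("Peter")); // prints: Hello, Peter!
    }
}
```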

This service is added to an application context by classpath scanning. This is configured in the applicationContext.xml file of our hook:

applicationContext.xml

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans ...>

    <context:component-scan base-package="be.aca"/>

</beans>
```

The goal is to expose and use SayHelloService in our web content templates.

The solution

When you deploy a hook with Velocity tools to Liferay, the tools are stored behind a custom bean locator; check out com.liferay.portal.bean.BeanLocatorImpl for its implementation. We will create a custom implementation of this bean locator that searches for beans in Spring's application context instead of in the context that Liferay creates for Velocity tools.

Create a custom bean locator

So first we need to implement the com.liferay.portal.kernel.bean.BeanLocator interface. We will only implement the methods that are important for us and let the other methods throw an UnsupportedOperationException.

VelocityBeanLocator.java

```java
@Component
public class VelocityBeanLocator implements BeanLocator {

    private static final String SUFFIX = ".velocity";

    public Object locate(String name) {
        String realName = stripVelocitySuffix(name);
        return SpringBeanLocator.getBean(realName);
    }

    private String stripVelocitySuffix(String name) {
        String realName = name;
        if (realName.endsWith(SUFFIX)) {
            realName = realName.substring(0, realName.length() - SUFFIX.length());
        }
        return realName;
    }

    public ClassLoader getClassLoader() {
        throw new UnsupportedOperationException();
    }

    public String[] getNames() {
        throw new UnsupportedOperationException();
    }

    public Class<?> getType(String name) throws BeanLocatorException {
        throw new UnsupportedOperationException();
    }

    public <T> Map<String, T> locate(Class<T> clazz) throws BeanLocatorException {
        throw new UnsupportedOperationException();
    }
}
```
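The ".velocity" suffix handling exists because the names handed to locate(String) can arrive with that suffix appended, so the locator strips it before looking up the Spring bean. That stripping logic in isolation (the class name is hypothetical):

```java
public class SuffixDemo {

    static final String SUFFIX = ".velocity";

    // Same logic as VelocityBeanLocator.stripVelocitySuffix: drop a
    // trailing ".velocity" if present, otherwise return the name as-is.
    static String strip(String name) {
        return name.endsWith(SUFFIX)
            ? name.substring(0, name.length() - SUFFIX.length())
            : name;
    }

    public static void main(String[] args) {
        System.out.println(strip("be.aca.liferay.SayHelloService.velocity"));
        // prints: be.aca.liferay.SayHelloService
    }
}
```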

The important method here is locate(String). It delegates to SpringBeanLocator, a helper class that holds Spring's ApplicationContext in a static field. The context is initialized at deploy time because the class implements Spring's ApplicationContextAware interface:

SpringBeanLocator.java

```java
@Component
public class SpringBeanLocator implements ApplicationContextAware {

    private static ApplicationContext ctx;

    @SuppressWarnings("unchecked")
    public static <T> T getBean(String className) {
        try {
            return (T) ctx.getBean(Class.forName(className));
        } catch (BeansException e) {
            throw new RuntimeException(e);
        } catch (ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public void setApplicationContext(ApplicationContext applicationContext) {
        ctx = applicationContext;
    }
}
```

Replace Liferay's bean locator with your own

We have to tell Liferay to use our bean locator instead of the default BeanLocator implementation. We will replace the original bean locator in a servlet context listener, which we configure in the web.xml of our project.

VelocityBeanLocatorContextListener.java

```java
public class VelocityBeanLocatorContextListener implements ServletContextListener {

    private static final String CONTEXT_CONFIG = "contextConfigLocation";

    private XmlWebApplicationContext ctx;

    public VelocityBeanLocatorContextListener() {
        ctx = new XmlWebApplicationContext();
    }

    public void contextInitialized(ServletContextEvent sce) {
        initializeContext(sce);
        String servletContext = sce.getServletContext().getContextPath().substring(1);
        PortletBeanLocatorUtil.setBeanLocator(servletContext, new VelocityBeanLocator());
    }

    private void initializeContext(ServletContextEvent sce) {
        ctx.setServletContext(sce.getServletContext());
        ctx.setConfigLocation(sce.getServletContext().getInitParameter(CONTEXT_CONFIG));
        ctx.refresh();
    }

    public void contextDestroyed(ServletContextEvent sce) {
        ctx.close();
    }
}
```

The key line is the call to PortletBeanLocatorUtil.setBeanLocator in contextInitialized: it registers our custom bean locator for this hook's servlet context. The bean locator is only used within the context of this hook, so you don't risk breaking anything in other portlets or hooks. The rest of the code just initializes Spring's application context.

This context listener has to be configured inside the web.xml so it is called when your hook gets deployed. This is what your web.xml will look like:

web.xml

```xml
<?xml version="1.0" encoding="UTF-8"?>
<web-app ...>

    <listener>
        <listener-class>be.aca.literay.spring.VelocityBeanLocatorContextListener</listener-class>
    </listener>

    <context-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>classpath*:applicationContext.xml</param-value>
    </context-param>
</web-app>
```

Call your service the normal way in Velocity or Freemarker

When your hook is deployed (don't forget to add an empty liferay-hook.xml file!), you'll be able to call your Spring services by using the utilLocator variable inside Velocity templates.

custom_service.vm

```velocity
#set ($helloTool = $utilLocator.findUtil("velocity-hook", "be.aca.liferay.SayHelloService"))
$helloTool.sayHello("Peter")
```

Or, if you prefer using Freemarker:

custom_service.ftl

```ftl
<#assign sayHelloTool=utilLocator.findUtil("velocity-hook", "be.aca.liferay.SayHelloService")>
${sayHelloTool.sayHello("Peter")}
```

In the background, the util locator will now pass through your bean locator and your Spring service will be retrieved from the application context, defined in your hook. And we'll have this awesome result:

Mission successful!

Conclusion

The ability to use advanced logic inside web content templates allows you to better separate business logic from presentation logic. Your templates will be much cleaner and only contain presentation logic. Your Java classes will do the heavy work and can be managed, optimized and unit tested properly. Guess this is a win-win situation!

Clone the full example code on GitHub: https://github.com/limburgie/velocity-bean-locator. Your feedback is very welcome!

Peter Mesotten 2014-04-15T21:01:21Z
Categories: CMS, ECM

#07 Liferay São Paulo User Group

Liferay - Tue, 04/15/2014 - 14:48
Hello everyone! We will hold the 7th Liferay meetup on 24/04/2014 at iMasters. We already have one confirmed talk: Fernando Tadashi (Liferay consultant) will speak about extending Marketplace plugins. I'd like to know if anyone is interested in presenting something they have built with Liferay; we still have one slot available. The agenda is the usual one:

19:15 - Opening
19:30 - Talk 1 - Fernando Tadashi - Extending Marketplace plugins
20:15 - Break
20:45 - Talk 2 - To be defined
21:30 - Closing

I'm counting on your presence. See you there!

Subscribe to the group to hear about new events: http://www.meetup.com/Liferay-Sao-Paulo-User-Group/

Regards, Paulo Fernandes 2014-04-15T19:48:53Z
Categories: CMS, ECM

A social intranet increases organizational effectiveness

Liferay - Tue, 04/15/2014 - 03:34

Almost every organization has a website that serves as the business card and news channel for the business. An intranet is also fairly standard, certainly at large organizations. Unfortunately, an intranet is often used mainly to broadcast information. That is a missed opportunity, because adding social capabilities can greatly increase a company's effectiveness.

In the online world you can hardly do without social networks anymore. Almost everyone is active on at least two platforms, professionally and privately, and uses them to share information, consult with others and stay on top of the latest developments. It is not surprising that these social networks are often used for work as well. That is actually a shame, because giving employees a social intranet lets them collaborate and share knowledge far more effectively, making the organization more effective as a whole.

More than a staff directory
One of the biggest problems with traditional intranet environments is that they are rather static. They are essentially a website for internal use, used mainly to share information, with few options for interaction or collaboration. Transforming an existing intranet into a modern social intranet is not that simple, however. It takes more than adding a few buttons for sharing posts on the well-known social networks. A social intranet must be a highly flexible environment in which users can maintain wikis, share documents, start forums, publish blogs and, just as on Facebook and LinkedIn, share timeline posts with their followers. That is something very different from an old-fashioned intranet with a digital staff directory and a flat list of the latest internal and external news items.

Integration with the business
A social intranet must enable employees to collaborate and exchange information in a highly flexible way. Useful features such as a staff directory and a news overview should of course not be missing, but interaction must come first. This requires a dynamic platform with the necessary links and integrations to the corporate directory, the CRM system and other relevant business applications. After all, the platform must be part of the organization and not become an isolated information silo. User permissions must also be properly arranged, so that team sites can be set up for specific employees and access to certain information can be restricted. This clearly makes a social intranet considerably more complex than a standard intranet, but the business case is quickly made once an organization experiences its benefits.

Engagement and interaction
A social intranet is a logical evolution of the old-fashioned variety. It motivates employees to be engaged with their organization, because in such an environment they can do more than just consume information. It gives them the means to inform their internal social network, to share knowledge and to confer with colleagues. And, not unimportantly, all of this happens in a secure environment controlled by the employer. Companies that invest in a social intranet will find that it leads to greater engagement with the organization and more effective collaboration. Give your employees tools that encourage interaction, and within months you will achieve far more than with years of broadcasting information in an old-fashioned intranet environment.

This blog post previously appeared on www.blogit.nl

Ruud Kluivers 2014-04-15T08:34:52Z
Categories: CMS, ECM

Websites are mobile portals to your business

Liferay - Tue, 04/15/2014 - 03:31

In e-commerce, the term "responsive design" has been in vogue for a while. Most online marketers are by now convinced that a web shop's conversion rate (that is, the share of visits that end in a transaction) depends directly on how well the shop adapts to mobile devices. This has major consequences for every company that wants to reach potential customers through its web platform.

According to figures from ISM eCompany, in the first half of 2013 as much as 14.3 percent of all online transactions by its customers were already made via smartphone or tablet. The number of desktop transactions fell by 9 percent in the first half of 2013 compared with the same period a year earlier. That says a lot about the willingness of the Dutch to shop online on a mobile device. And no, this matters not only to online retailers. Websites that are not used for direct sales also need to take into account the devices their visitors use. According to ISM eCompany, 30 percent of all website visits already come from mobile devices, and that percentage will rise quickly in the coming years.

An optimal experience
The figures from the e-commerce world speak for themselves. Mobile website traffic is growing worldwide by the day and will at some point overshadow the desktop. For every company, the website is the gateway to the organization, and more and more organizations are also developing commercial activities through their web platform. Whether you are informing employees and customers or providing information about complex IT solutions, visitors expect an optimal user experience. With responsive design, a potential customer or employee automatically gets the optimal version of the website: a high-res visual experience for tablets, a compact version for smartphones and a full variant for visitors arriving via laptop or PC.

A universal platform
A good example of a company that optimized its website for mobile devices and adopted responsive design quite early is One World. This airline alliance includes American Airlines, British Airways, Cathay Pacific and Iberia, among others. The website acts as a web shop for flight tickets from all of these airlines and also serves as the information channel of the four partners. For example, it is currently one of the official websites publishing information about the missing Malaysia Airlines flight MH370. So in addition to its commercial role as a web shop, One World must also be accessible for news on all kinds of mobile devices. Visitors automatically get the information presented with a user experience that matches the device they are using; within milliseconds the device is recognized and the website adapts.

CMS or portal
Nowadays it is relatively easy to make business websites "mobile ready" through responsive design with the help of a good enterprise portal. A portal is much more flexible than a standard CMS and offers broad integration capabilities and options for websites, intranets and extranets alike; moreover, such a platform supports responsive design for all of these audiences out of the box. The days when companies could maintain a static website for years without commercial consequences are over. An enterprise portal offers the flexibility that today's organizations need to move quickly with the market. It does require companies to invest in a platform that provides flexible integration with their back-end systems, social media support and (mobile) ease of use. In my view that investment is worth every penny, because without a mobile gateway to your organization, potential customers are bound to end up with your competitors.

This blog post also previously appeared on www.blogit.nl

Ruud Kluivers 2014-04-15T08:31:46Z
Categories: CMS, ECM

3 Tips for Incorporating Content and Social Selling into Your Sales Strategy

KnowledgeTree - Fri, 04/11/2014 - 10:31

In late 2012, market research firm Forrester dropped a statistic on sales and marketing organizations that likely made them shudder with sudden pangs of helplessness and fear.

According to Forrester’s research, today’s buyers complete as much as 90 percent of their decision-making journey before they ever reach out (or respond) to vendors. They search Google, ask for referrals on social networks, read industry blogs, and digest reviews. And if your business is lucky enough to survive that rigorous filtering process, customers might respond to your outreach.

Yes, marketers can still play an influential role in awareness and guidance during that prolonged period of proactive customer research, and salespeople can still have a big impact on the final steps of a buyer’s journey. But for any of that to happen, both functions need to change the way they operate to become part of the conversation before their customers have already made up their minds.

So, how can you do that?

Content and social selling are two great tactics for engaging in two-way conversations with prospective customers earlier in the buying process. These strategies allow sales and marketing teams to monitor customer activity and interest leading up to a purchase decision, and quickly respond — via customers’ preferred communication channels — to questions or concerns with relevant, helpful information. That proactive approach can go a long way toward building more engaging, trusting, and loyal relationships that, ultimately, spur sales.

Seems obvious enough, right? So how exactly should B2B businesses go about incorporating content and social selling tactics into their sales strategy?

Here are three simple tips:

1. Focus your efforts: While it might be tempting to chase customers down every rabbit hole on the Web, your goal should be to determine the online channels your target customers visit most frequently and then deliver the right content and insight to them there. To make that determination, start by analyzing your best current customers and look for common threads. Content and social selling tools can also be helpful to match the right content marketing assets to the right customer and marketing channel.

2. Listen first and then act: Content and social selling is very different from conventional marketing and sales tactics. It requires two-way conversation and a deeper appreciation of buyers’ needs at each stage of their journey. This makes listening — rather than constantly interrupting to deliver another value prop — a key skill. Follow your customers’ conversations online, identify common challenges, and observe how buyers interact with each other. That insight should help you create more targeted content and dramatically amplify your messaging.

3. Resist the temptation to sell: When you finally feel comfortable offering something of value to your customers, avoid the trap of immediately jumping into a sales pitch. Engaging through content and social channels requires trust, and the only way to earn that is to deliver truly helpful hints, suggestions, and other educational tips that help buyers better understand their pain points. Once you’ve established credibility and it’s clear that buyers are ready to progress through the funnel, you can begin delivering content or commentary that is more directly tied to your product or solution.

Above all, B2B businesses must strive to achieve consistency and relevancy in their content and social selling efforts. The more customers view your business as a critical source of insight and information, the more they’ll lean on you throughout their buying process.

Ultimately, that will help you not only engage customers earlier in the buying cycle (getting a foot in the door well before buyers have completed 90 percent of their buying decision), it will also give your business a significantly better chance of making the final cut.

The post 3 Tips for Incorporating Content and Social Selling into Your Sales Strategy appeared first on KnowledgeTree.

Categories: ECM

Liferay CometD Ajax Push / Liferay Reverse Ajax

Liferay - Fri, 04/11/2014 - 04:46
Introduction

Ajax Push is a mechanism for pushing data from the server. In a typical web application, the client sends a request and receives fresh or updated data from the server in the response. Sometimes, however, we need the server to send data to the client on its own. This is called server push: the client does not send a request, but the server automatically pushes data to the client as soon as it is updated.

To implement server push we will use a protocol called Bayeux.

Bayeux

Bayeux is a protocol for transporting asynchronous messages (primarily over HTTP) with low latency between a web server and a web client. Messages are routed via named channels and can be delivered server to client, client to server, or client to client (via the server). The primary purpose of Bayeux is to support responsive bidirectional interactions between web clients (for example using AJAX) and the web server.

In our case, the server will publish data on channels and clients will subscribe to those channels. As soon as data is published on a channel, it automatically reaches every client subscribed to that channel.

CometD

CometD is a scalable HTTP-based event routing bus built on the Ajax Push technology pattern known as Comet. It is a Dojo Foundation project that provides implementations of the Bayeux protocol in JavaScript, Java, Perl, Python and other languages. Other organizations (e.g. Sun, IBM and BEA) also have Bayeux implementations that are not strictly part of the CometD project.

Environment

Liferay 6.2 + Tomcat 7.x + MySQL 5.1

Note: the code is written for portal version 6.2; you can try it on 6.1 too.
Download the Liferay CometD Ajax Push portlet (source and WAR file) from the following location:

https://sourceforge.net/projects/meeralferay/files/LiferayReverseAjaxPortlet/

Portlet Screen-1

Deploying and using the portlet

You can place the WAR file directly in your portal's deploy folder, or deploy the portlet from source. Once the portlet is deployed successfully, you will find it in the sample category under the name Liferay CometD Reverse Ajax.

As soon as you place the portlet on a page, you will see stock prices that are updated automatically every few seconds. If you open the portlet in another browser, you will see the prices change there as well.

Note: please read the entire article before using this portlet.

Implementation

We will use the Bayeux Java implementation to produce data on channels. Once data has been produced, the CometD JavaScript implementation delivers it to the client automatically; both Dojo and jQuery bindings are available. An example web application for CometD is available as a reference.

Using CometD Ajax Push in a Liferay portlet involves four steps:
1. Configure the CometD servlet in the portlet's web.xml.
2. Implement a data producer.
3. Implement a service that publishes the data on channels using Bayeux.
4. Use the CometD JavaScript library on the client side to subscribe to channels and display the data.

Concept

We will take stock price updates as an example: whenever a stock's price changes, we send the change to the client. A Java scheduler produces new data every few seconds on the server side, and as soon as data is produced, a CometD service publishes it on a channel. On the client side we use the CometD jQuery bindings to subscribe to the channel and show the price changes.

Configure the CometD servlet in the portlet's web.xml

We need to configure a CometD servlet to initiate the whole process. Several servlet implementations exist; here we use AnnotationCometdServlet and pass our CometD service as an init parameter. The following is the relevant code snippet:

```xml
<servlet>
    <servlet-name>cometd</servlet-name>
    <servlet-class>org.cometd.annotation.AnnotationCometdServlet</servlet-class>
    <init-param>
        <param-name>transports</param-name>
        <param-value>org.cometd.websocket.server.WebSocketTransport</param-value>
    </init-param>
    <init-param>
        <param-name>services</param-name>
        <param-value>com.meera.commetd.StockPriceService</param-value>
    </init-param>
    <init-param>
        <param-name>maxLazyTimeout</param-name>
        <param-value>2000</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>cometd</servlet-name>
    <url-pattern>/cometd/*</url-pattern>
</servlet-mapping>
```

Implement a data producer

We need a source that generates data, so we use a Java class that produces random prices for a set of stock quotes. A scheduled executor runs this task every few seconds.
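Independent of CometD, the producer side is just a ScheduledExecutorService firing a task every few seconds. A self-contained sketch of that pattern (the class name, interval and prices are hypothetical):

```java
import java.util.Random;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PriceTicker {

    // Produces a bounded random price move, as the emitter does:
    // the delta is in [0, 1) and the sign is chosen at random.
    static float nextValue(float oldValue, Random random) {
        float delta = random.nextFloat();
        return oldValue + (random.nextBoolean() ? delta : -delta);
    }

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch threeTicks = new CountDownLatch(3);
        Random random = new Random();

        // Fire a "price update" every 50 ms; a real emitter would hand the
        // update to a publishing service instead of printing it.
        scheduler.scheduleWithFixedDelay(() -> {
            System.out.println("ORCL -> " + nextValue(29.94f, random));
            threeTicks.countDown();
        }, 0, 50, TimeUnit.MILLISECONDS);

        threeTicks.await(5, TimeUnit.SECONDS);
        scheduler.shutdownNow();
    }
}
```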
We have used the StockPriceEmitter Java class, built on an ExecutorService.

Implement a service that publishes data on channels using Bayeux

We need a service that is responsible for publishing data on channels via the Bayeux protocol: first create the channel, then publish to it.

Creating a channel:

```java
bayeuxServer.createChannelIfAbsent(channelName, new ConfigurableServerChannel.Initializer() {
    public void configureChannel(ConfigurableServerChannel channel) {
        channel.setPersistent(true);
        channel.setLazy(true);
    }
});
```

Publishing data on a channel:

```java
ServerChannel channel = bayeuxServer.getChannel(channelName);
channel.publish(sender, data);
```

Use the CometD JavaScript library on the client side

On the client side we use the CometD JavaScript implementation to subscribe to the channel and show the data. We need to configure the CometD servlet URL and add two listeners to perform the handshake with the server:

```javascript
var cometURL = location.protocol + "//" + location.host + config.contextPath + "/cometd";
cometd.configure({
    url: cometURL,
    logLevel: 'debug'
});

cometd.addListener('/meta/handshake', _metaHandshake);
cometd.addListener('/meta/connect', _metaConnect);
cometd.handshake();
```

Subscribing to a channel:

```javascript
cometd.subscribe('/stock/*', function(message) {
    var data = message.data;
});
```

Note: the CometD JavaScript sources must be included in the page; the cometd JavaScript object is then available for calling these methods.
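The channel-name convention used by the service is easy to see in isolation: the symbol is lower-cased and prefixed with /stock/, which is what makes the client's /stock/* wildcard subscription match. A tiny sketch (the class name is hypothetical):

```java
import java.util.Locale;

public class ChannelNames {

    // Builds the Bayeux channel name for a stock symbol,
    // the same way StockPriceService does.
    static String channelFor(String symbol) {
        return "/stock/" + symbol.toLowerCase(Locale.ENGLISH);
    }

    public static void main(String[] args) {
        System.out.println(channelFor("ORCL")); // prints: /stock/orcl
    }
}
```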
Our application-specific JavaScript lives in application.js. The following are all the required JavaScript source files:

```html
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/cometd-namespace.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/cometd-json.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/AckExtension.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/TransportRegistry.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/Transport.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/RequestTransport.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/WebSocketTransport.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/CallbackPollingTransport.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/LongPollingTransport.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/Utils.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/Cometd.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/jquery.cometd.js"></script>
<script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/application.js"></script>
```

Complete source code for the implementation

The following is the web.xml file:

```xml
<web-app id="WebApp_ID" version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd">

    <display-name>LiferayCommetDRevserseAjax-portlet</display-name>

    <servlet>
        <servlet-name>cometd</servlet-name>
        <servlet-class>org.cometd.annotation.AnnotationCometdServlet</servlet-class>
        <init-param>
            <param-name>transports</param-name>
            <param-value>org.cometd.websocket.server.WebSocketTransport</param-value>
        </init-param>
        <init-param>
            <param-name>services</param-name>
            <param-value>com.meera.commetd.StockPriceService</param-value>
        </init-param>
        <init-param>
            <param-name>maxLazyTimeout</param-name>
            <param-value>2000</param-value>
        </init-param>
        <load-on-startup>1</load-on-startup>
    </servlet>
    <servlet-mapping>
        <servlet-name>cometd</servlet-name>
        <url-pattern>/cometd/*</url-pattern>
    </servlet-mapping>
    <filter>
        <filter-name>cross-origin</filter-name>
        <filter-class>org.eclipse.jetty.servlets.CrossOriginFilter</filter-class>
    </filter>
    <filter-mapping>
        <filter-name>cross-origin</filter-name>
        <url-pattern>/cometd/*</url-pattern>
    </filter-mapping>

    <jsp-config>
        <taglib>
            <taglib-uri>http://java.sun.com/portlet_2_0</taglib-uri>
            <taglib-location>/WEB-INF/tld/liferay-portlet.tld</taglib-location>
        </taglib>
        <taglib>
            <taglib-uri>http://liferay.com/tld/aui</taglib-uri>
            <taglib-location>/WEB-INF/tld/aui.tld</taglib-location>
        </taglib>
    </jsp-config>
</web-app>
```

The following is the StockPriceService Java class:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;

import javax.inject.Inject;

import org.cometd.annotation.Service;
import org.cometd.annotation.Session;
import org.cometd.bayeux.server.BayeuxServer;
import org.cometd.bayeux.server.ConfigurableServerChannel;
import org.cometd.bayeux.server.LocalSession;
import org.cometd.bayeux.server.ServerChannel;

@Service
public class StockPriceService implements StockPriceEmitter.Listener {

    @Inject
    private BayeuxServer bayeuxServer;

    @Session
    private LocalSession sender;

    public void onUpdates(List<StockPriceEmitter.Update> updates) {
        for (StockPriceEmitter.Update update : updates) {
            // Create the channel name using the stock symbol
            String channelName = "/stock/" + update.getSymbol().toLowerCase(Locale.ENGLISH);

            // Initialize the channel, making it persistent and lazy
            bayeuxServer.createChannelIfAbsent(channelName, new ConfigurableServerChannel.Initializer() {
                public void configureChannel(ConfigurableServerChannel channel) {
                    channel.setPersistent(true);
                    channel.setLazy(true);
                }
            });

            // Convert the Update business object to a CometD-friendly format
            Map<String, Object> data = new HashMap<String, Object>(4);
            data.put("symbol", update.getSymbol());
            data.put("oldValue", update.getOldValue());
            data.put("newValue", update.getNewValue());

            // Publish to all subscribers
            ServerChannel channel = bayeuxServer.getChannel(channelName);
            channel.publish(sender, data);
        }
    }
}
```

The following is the StockPriceEmitter Java class:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.EventListener;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class StockPriceEmitter implements Runnable {

    private final ScheduledExecutorService scheduler = Executors
            .newSingleThreadScheduledExecutor();
    private final List<String> symbols = new ArrayList<String>();
    private final Map<String, Float> values = new HashMap<String, Float>();
    private final List<Listener> listeners = new CopyOnWriteArrayList<Listener>();

    public StockPriceEmitter() {
        symbols.addAll(Arrays.asList("ORCL", "MSFT", "GOOG", "YHOO", "FB"));
        values.put("ORCL", 29.94f);
        values.put("MSFT", 27.10f);
        values.put("GOOG", 655.37f);
        values.put("YHOO", 17.82f);
        values.put("FB", 21.33f);
    }

    public List<Listener> getListeners() {
        return listeners;
    }

    public void start() {
        run();
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    public void run() {
        Random random = new Random();

        List<Update> updates = new ArrayList<Update>();

        // Randomly choose how many stocks to update
        int howMany = random.nextInt(symbols.size()) + 1;
        for (int i = 0; i < howMany; ++i) {
            // Randomly choose which one to update
            int which = random.nextInt(symbols.size());
            String symbol = symbols.get(which);
            float oldValue = values.get(symbol);

            // Randomly choose how much to update
            boolean sign = random.nextBoolean();
            float howMuch = random.nextFloat();
            float newValue = oldValue + (sign ? howMuch : -howMuch);
```
howMuch : -howMuch);                        // Store the new value                      values.put(symbol, newValue);                        updates.add(new Update(symbol, oldValue, newValue));               }                 // Notify the listeners               for (Listener listener : listeners) {                      listener.onUpdates(updates);               }                 // Randomly choose how long for the next update               // We use a max delay of 1 second to simulate a high rate of updates               long howLong = random.nextInt(1000);               scheduler.schedule(this, howLong, TimeUnit.MILLISECONDS);        }          public static class Update {               private final String symbol;               private final float oldValue;               private final float newValue;                 public Update(String symbol, float oldValue, floatnewValue) {                      this.symbol = symbol;                      this.oldValue = oldValue;                      this.newValue = newValue;               }                 public String getSymbol() {                      return symbol;               }                 public float getOldValue() {                      return oldValue;               }                 public float getNewValue() {                      return newValue;               }        }          public interface Listener extends EventListener {               void onUpdates(List<Update> updates);        } }     The following is Liferay Portlet Class   import java.io.IOException; import javax.portlet.ActionRequest; import javax.portlet.ActionResponse; import javax.portlet.PortletException; import com.liferay.util.bridges.mvc.MVCPortlet; public class LiferayCommetDReverseAjax extends MVCPortlet {      private StockPriceEmitter emitter;     public void startStockUpdates(     ActionRequest actionRequest, ActionResponse actionResponse)                      throws IOException, PortletException {     emitter = new 
StockPriceEmitter();  StockPriceService service(StockPriceService)getPortletContext().
getAttribute(StockPriceService.class.getName());   // Register the service as a listener of the emitter   emitter.getListeners().add(service);                       // Start the emitter     emitter.start();               } public void stopStockUpdates( ActionRequest actionRequest, ActionResponse actionResponse) throws IOException, PortletException { //emitter = new StockPriceEmitter(); emitter.stop();               } }     The following is application.js code     (function($) {     var cometd = $.cometd; //alert(cometd);     $(document).ready(function()     {         function _connectionEstablished()         {             $('#body').append('<div>CometD Connection Established</div>');         }           function _connectionBroken()         {             $('#body').append('<div>CometD Connection Broken</div>');         }           function _connectionClosed()         {             $('#body').append('<div>CometD Connection Closed</div>');         }           // Function that manages the connection status with the Bayeuxserver         var _connected = false;         function _metaConnect(message)         {             if (cometd.isDisconnected())             {                 _connected = false;                 _connectionClosed();                 return;             }               var wasConnected = _connected;             _connected = message.successful === true;             if (!wasConnected && _connected)             {                 _connectionEstablished();             }             else if (wasConnected && !_connected)             {                 _connectionBroken();             }         }           // Function invoked when first contacting the server and         // when the server has lost the state of this client         function _metaHandshake(handshake)         {             if (handshake.successful === true)             {                 cometd.batch(function()                 {                     cometd.subscribe('/stock/*', function(message)        
             {                                          var data = message.data;                         var symbol = data.symbol;                         var value = data.newValue;                        // alert(symbol);                         var id = 'stock_'+ symbol;                         var symbolDiv=document.getElementById(id);                         if (!symbolDiv)                         {                         symbolDiv = document.createElement('div');                         symbolDiv.id =id;                         document.getElementById('stocks').appendChild(symbolDiv);                         }                         symbolDiv.innerHTML = '<span class="symbol">' + symbol + ': <b>' + value + '</b></span>';                     });                     // Publish on a service channel since the message is for the server only                     //cometd.publish('/stock/*', { name: 'World' });                 });             }         }           // Disconnect when the page unloads         $(window).unload(function()         {             cometd.disconnect(true);         });           var cometURL = location.protocol + "//" + location.host + config.contextPath + "/cometd";         cometd.configure({             url: cometURL,             logLevel: 'debug'         });           cometd.addListener('/meta/handshake', _metaHandshake);         cometd.addListener('/meta/connect', _metaConnect);         cometd.handshake();     }); })(jQuery);     The following is portlet JSP page.     
<%@ include file="init.jsp"%>  <scriptsrc="http://ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/cometd-namespace.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/cometd-json.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/AckExtension.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/TransportRegistry.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/Transport.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/RequestTransport.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/WebSocketTransport.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath()%>/js/comet/CallbackPollingTransport.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/LongPollingTransport.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/Utils.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/Cometd.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/jquery.cometd.js"></script>            <script type="text/javascript" src="<%=renderRequest.getContextPath() %>/js/comet/application.js"></script> <%--     The reason to use a JSP is that it is very easy to obtain server-side configuration     information (such as the contextPath) and pass it to the JavaScript environment on the client.     
--%> <style> #stocks .symbol b{ color:red; } #controler-table td{ padding-left:30px; } </style> <script type="text/javascript">         var config = {             contextPath: '<%=renderRequest.getContextPath()%>'         };  </script> <portlet:actionURL  var="startStockUpdates" name="startStockUpdates"> </portlet:actionURL> <portlet:actionURL  var="stopStockUpdates" name="stopStockUpdates"> </portlet:actionURL> <h2>Liferay CometD Ajax Push Portlet</h2> <br/> <table id="controler-table">        <tr>               <td><a href="<%=startStockUpdates.toString()%>">Start</a></td>               <td><a href="<%=stopStockUpdates.toString()%>">Stop</a></td>        </tr> </table> <br/> <div id="stocks"></div>     Author Meera Prince Liferay Top Contributor AwardWinner http://www.liferaysavvy.com meera prince 2014-04-11T09:46:40Z
Categories: CMS, ECM

Key Takeaways: 2014 AIIM Conference

Alfresco - Wed, 04/09/2014 - 08:09

Last week, Alfresco was at the 2014 AIIM Conference in lovely Orlando, Florida. We were there talking about the Future of Work, not just in the abstract, but achieving it in today’s workplace. John Newton, one of our founders and CTO, hosted a roundtable discussion on the topic and gave an extremely energetic talk to an overflowing room.

While attending AIIM every year seems excessive for some, I find it is very useful in gauging the evolving attitudes across the industry. While many technology events are overly optimistic about the future of technology, AIIM has drawn many of those concerned with the compliance side of the equation. As a result, all new technology is viewed with a measured amount of skepticism.

A large change that I’ve observed over the years is the shift regarding the cloud. Two years ago, just about every Records Manager in attendance was discussing and sharing ways to keep the cloud out of their organization. This year, everyone acknowledged the reality of the cloud and the need for everyone to adapt.

This is not to say that they all thought the cloud was great. They simply see the cloud already permeating their organizations. That leaves them with a choice: either ensure proper governance is applied or let the cloud ecosystem grow wild. There is obvious reluctance to embrace the cloud, but the reality of the cloud was shared in session after session.

Everyone shared tips about performing due diligence, checking security, understanding privacy rules of different countries, and other details that are not that much different from any acquisition process. There was very little FUD shared about the cloud which was a pleasant surprise. The biggest issue raised was the possible creation of another silo of content but that has been a common concern in the industry for decades.

Even with all this positive direction on the cloud-front, it was not the biggest shift at the AIIM Conference.

The Future in Information Governance

The big take-away from the AIIM Conference was the shift to Information Governance. Over the years, Records Management has been a difficult proposition for people to sell within their organization. People don’t want to spend time to declare records and they don’t want to spend money on a system that is not perceived as adding to the bottom line.

Information Governance changes the baseline for the compliance discussion. While some people merely swap the term Records Management for Information Governance, the change goes much deeper than that. Information Governance covers the entire lifecycle of all information, content and data. It isn't just about retention and disposition, but about protection and findability.

This transition toward focusing on a piece of content's business value, rather than on the risk of keeping the content too long, is a well-received change. In today's Information Age, we need to start managing information as an asset and prioritize it as such.

Part of the move was also on display as there was a lot of discussion on how to automate the categorization of information. As every piece of information should be protected as long as it has business value, different types of information will have different lifespans. The U.S. National Archives and Records Administration (NARA) was there sharing their views and bluntly asking for industry advice on how to automate this process. The general opinion of the speakers was that auto-classification of information is the future.

Were you at AIIM? What did you see there that caught your eye?

RESOURCE: Take our Records Management Self-Assessment to compare your current implementation to industry best practices.

Categories: ECM

Community Meeting: Stuttgart (15. April 2014)

Liferay - Mon, 04/07/2014 - 10:44

Hello everyone,

I'm on the road again - more precisely, at the "Administering Liferay Systems" training in Stuttgart (hint: there are still seats available) - and on the evening of Tuesday, 15 April, I had nothing planned yet. Correction: now I do!

On short notice, I'm calling a community meeting at Café Kaiserbau on Marienplatz in Stuttgart, for friendly conversation and exchange over beer, wine or another drink. To get a rough headcount and reserve a suitable table, please let me know briefly by commenting here, on Twitter, or by mail (olaf dot kock at liferay dot com).

No agenda, no presentations (unless someone volunteers), just pleasant conversation. Start: 18:30 (6:30 pm). The location, originally to be announced no later than the day before (suggestions from locals were welcome), is now confirmed above: Café Kaiserbau. I have reserved a table - ask for either "Liferay" or my name if I'm not there yet.

Olaf Kock 2014-04-07T15:44:40Z
Categories: CMS, ECM

Searching entities by custom attribute value

Liferay - Sat, 04/05/2014 - 06:24

If you need to search for users who have a particular custom attribute value, this can be achieved easily using the Liferay Expando API, as follows.

String attrValue = "IT";
String attributeName = "user-department-name";
String tableName = ExpandoTableConstants.DEFAULT_TABLE_NAME;
long classNameId = ClassNameLocalServiceUtil.getClassNameId(User.class);

List<ExpandoValue> expandoValues = ExpandoValueLocalServiceUtil.getColumnValues(
    companyId, classNameId, tableName, attributeName, attrValue, -1, -1);

for (ExpandoValue expandoValue : expandoValues) {
    try {
        long userId = expandoValue.getClassPK();
        User user = UserLocalServiceUtil.getUser(userId);
    }
    catch (NoSuchUserException nsue) {
        _log.error("No such user exists");
    }
}

 

sushil patidar 2014-04-05T11:24:50Z
Categories: CMS, ECM

Configure SSO in Liferay with OKTA using SAML 2.0 protocol

Liferay - Thu, 04/03/2014 - 08:52

In this blog, I am listing the steps to configure SSO in Liferay with OKTA using SAML 2.0 protocol.

OKTA is an enterprise-grade identity management service, built from the ground up in the cloud. The Okta identity management service provides directory services, SSO, strong authentication, provisioning, workflow and built-in reporting.

If you are not familiar with SAML, check out the awesome blog post by Mika Koivisto.

I used Liferay 6.1 EE GA2 bundled with Tomcat in this exercise.

I followed these steps: 

  1. Create an account at http://www.okta.com/ for an enterprise trial.
  2. You will get a confirmation email containing a URL. You will be able to see this screen once you access the URL mentioned in the mail.

3. Go to the Applications tab and add a new application using SAML 2.0.

 

4. Provide app name: 

 

5. Define the SSO URL, SP entity ID, name ID format and default username on the next screen.

 

6. Make the app internal and finish; once done, navigate to the SSO tab.

7. Click view setup instructions: 

Save the content of the IdP metadata into the octametadata.xml file.

Now we are done with OKTA (IDP) configuration setup.

Configuration at Liferay (SP) Side:

  1. Extract the Liferay bundle to some location.
  2. Start the server and deploy the SAML plugin downloaded from the Marketplace.
  3. Paste the octametadata.xml file into the data folder of Liferay.

 4. Create the keystore, along with its public and private key pair, using the Java keytool:

              keytool -genkeypair -alias samlspdemo -keyalg RSA -keysize 2048 -keypass password -keystore data/keystoresp.jks

 5. Once done, create portal-ext.properties in the Liferay Home folder and add these lines to the file:

saml.role=sp
saml.entity.id=samlspdemo
saml.metadata.paths={location of saved octametadata.xml}

# Keystore
saml.keystore.type=jks
saml.keystore.path=${liferay.home}/data/keystoresp.jks
saml.keystore.password=password
saml.keystore.credential.password[samlspdemo]=password

# Service Provider
saml.sp.default.idp.entity.id=http://www.okta.com/kpqs6np8EEBKPQZCLHXQ
saml.sp.sign.authn.request=true
saml.sp.assertion.signature.required=false
saml.sp.clock.skew=3000
saml.sp.session.keepalive.url=http://localhost:8080/c/portal/saml/idp/keepalive

 6. Restart Liferay to check the functionality.

After this, once you click Sign In on the Liferay portal, you will be redirected to the Okta sign-in page. If you enter the correct credentials there, you will be redirected back to Liferay and logged in automatically.
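One property from the listing above, saml.sp.clock.skew, deserves a note: it tolerates small clock differences between the IdP and the SP when the assertion's validity window is checked. Conceptually the check looks like the sketch below (illustrative only, not the SAML plugin's actual code; the class and method names are made up):

```java
public class ClockSkewCheck {

    /**
     * Illustrative only: accepts an assertion whose NotOnOrAfter instant has
     * passed by less than the configured skew (cf. saml.sp.clock.skew=3000,
     * i.e. 3000 ms). All instants are epoch milliseconds.
     */
    static boolean withinSkew(long notOnOrAfterMillis, long nowMillis, long skewMillis) {
        return nowMillis < notOnOrAfterMillis + skewMillis;
    }

    public static void main(String[] args) {
        // An assertion that expired 2.5 s ago still passes with a 3 s skew
        System.out.println(withinSkew(10000L, 12500L, 3000L)); // prints true
    }
}
```

If logins fail intermittently with assertion-expired errors even though the setup is otherwise correct, increasing this tolerance is usually the first thing to try.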

Note :

  1. Make sure you create the same user in Liferay and OKTA for the basic setup.
  2. Assign the user to the app you created in OKTA.

 

 

Ankit Srivastava 2014-04-03T13:52:30Z
Categories: CMS, ECM

The Past Failures of Content Management

Alfresco - Thu, 04/03/2014 - 07:53

If you look across the Content Management industry, you will see a lot of vendors trying to provide the ultimate solution for organizations. They have built comprehensive systems with scores of components that can be combined in any manner to solve almost any problem. Yet, when AIIM asked people to share how they store their content, the answer was: "Over 60% of organizations still primarily use Network File Shares." (AIIM 2013 Industry Watch)

Given that the audience typically taking AIIM’s surveys are well versed in content and information management, this was a shocking result. Now imagine the state of affairs in organizations that haven’t heard of AIIM.

It is not as if we just recently started trying to disseminate the value of content management technologies to the world. One of our founders, John Newton, has created two successful content management companies over the past few decades. During this time, the industry has continued to struggle to make content management technology ubiquitous. We have invested years trying to build best practices and guidelines to improve the content management profession and bring order to the information chaos.

With all of that time and effort in learning how to deploy content management technology, more projects still continue to fail than succeed. Looking back over our industry’s history, success has been the exception, not the rule.

Why is content management so difficult? Why is user adoption so low?

Part of the problem lies in the very nature of the content management systems that have been created. Vendors have historically focused their roadmaps on adding new features. If a vendor added a feature that was well received, that feature quickly became part of the roadmap for its competitors. No vendor wanted to have the fewest checkmarks in the feature column during a prospect's evaluation cycle.

Meanwhile, the average person only cares about saving, updating, finding, and sharing information. They want it protected so it isn't accidentally deleted or shared with those who shouldn't have access to the information. The average person wants to focus on their job and not on records schedules or proper tagging of a document. They don't want to have to find the right command in a menu system full of 30 options.

This is the world that the content management industry has created. A world where success is rare. Next week, we will look deeper into the differing layers of complexity that have hindered our progress towards a tradition of success.

RESOURCE: Download our Connected Enterprise Survey results to learn what users are saying about enterprise content and who they think should be in control.

Categories: ECM

How to clean previous versions of documents from Documents and Media

Liferay - Wed, 04/02/2014 - 14:22

If you are using Liferay Sync, you might have run into an issue where you end up with 1,000 versions of the same document, so that a 1 MB document consumes 1 GB of disk space.

The reason is that every save of a document opened from a folder mapped with Liferay Sync creates a new version, as long as you are connected to the network and Liferay Sync is active.

You can use this Groovy script to delete all previous versions of the documents and free up the disk space. Run it from Control Panel > Server Administration > Script, selecting Groovy.

Note: run it carefully; this script will delete all previous versions of the documents in a given folder and its children in the given group/site.
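At its core, the cleanup script applies one rule per file entry: keep the latest version label and delete every other one. Isolated from the Liferay service calls, that selection logic can be sketched in plain Java (VersionPruner is a made-up helper name, shown only to make the rule explicit):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class VersionPruner {

    /**
     * Returns every version label except the latest one; these are the
     * versions the cleanup script passes to deleteFileVersion().
     * Hypothetical helper, not part of any Liferay API.
     */
    static List<String> versionsToDelete(List<String> allVersions, String latest) {
        List<String> doomed = new ArrayList<String>();
        for (String version : allVersions) {
            if (!version.equals(latest)) {
                doomed.add(version);
            }
        }
        return doomed;
    }

    public static void main(String[] args) {
        // For a document with versions 1.0, 1.1 and 1.2, only 1.2 survives
        System.out.println(versionsToDelete(Arrays.asList("1.0", "1.1", "1.2"), "1.2"));
        // prints [1.0, 1.1]
    }
}
```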

 

import java.util.List;

import com.liferay.portal.kernel.repository.model.FileEntry;
import com.liferay.portal.kernel.repository.model.FileVersion;
import com.liferay.portal.kernel.repository.model.Folder;
import com.liferay.portlet.documentlibrary.service.DLAppServiceUtil;

// Param1: Group ID
// Param2: Folder ID
listFiles(16018, 9077187);

def listFiles(groupId, folderId) {
    allfiles = DLAppServiceUtil.getFileEntries(groupId, folderId);
    for (FileEntry file : allfiles) {
        System.out.println("File Title: " + file.getTitle());
        System.out.println("File Version: " + file.getVersion());
        List results = file.getFileVersions(-1);
        latestversion = file.getVersion();
        for (FileVersion fv : results) {
            if (fv.getVersion() != latestversion) {
                System.out.println("Deleting >> " + fv.getVersion());
                DLAppServiceUtil.deleteFileVersion(file.getFileEntryId(), fv.getVersion());
            }
        }
    }

    allfolders = DLAppServiceUtil.getFolders(groupId, folderId);
    for (Folder folder : allfolders) {
        System.out.println("Folder Name: " + folder.getName());
        listFiles(groupId, folder.getFolderId());
    }
}

Mitesh S Panchal 2014-04-02T19:22:55Z
Categories: CMS, ECM

Attention App Developers: Alfresco in the cloud just keeps getting better with new improvements to our API

Alfresco - Wed, 04/02/2014 - 03:30

Alfresco is excited to announce that we have made some important updates to Alfresco in the cloud.

What’s different with this new release? Here are the highlights:

First, we’ve added some new capabilities to our Alfresco One API for app developers. Alfresco has been a leader in the definition and implementation of Content Management Interoperability Services (CMIS), an important content standard that was first introduced in 2010.

As part of our commitment to open standards, in this latest cloud release, we have continued to enhance our support for CMIS with an added new feature called “CMIS Item Support.”

This new capability – which was introduced in the latest version of CMIS – allows app developers to access content that was outside the scope of the initial release of CMIS, most notably content beyond just documents and folders.

For example, let’s say you want to do a query for users. A simple query of “SELECT * FROM cm:person” will show you a list of users in your network. Similarly, a query of “SELECT * FROM cm:person WHERE cm:organization = 'Alfresco'” will show you the list of users from a particular company. You also get to choose the format of the response, whether XML or JSON.
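Note that in the CMIS query language string literals are single-quoted, so the organization filter is written WHERE cm:organization = 'Alfresco'. If you build such statements from user-supplied values, embedded quotes need escaping; a small sketch (CmisQueryBuilder is a hypothetical helper for illustration, not part of the Alfresco API):

```java
public class CmisQueryBuilder {

    /**
     * Builds a CMIS QL statement like the examples above, single-quoting the
     * literal and backslash-escaping backslashes and quotes inside it.
     * Hypothetical helper for illustration only.
     */
    static String personQuery(String property, String value) {
        String escaped = value.replace("\\", "\\\\").replace("'", "\\'");
        return "SELECT * FROM cm:person WHERE " + property + " = '" + escaped + "'";
    }

    public static void main(String[] args) {
        System.out.println(personQuery("cm:organization", "Alfresco"));
        // prints SELECT * FROM cm:person WHERE cm:organization = 'Alfresco'
    }
}
```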

To learn more about how to use our Cloud API, visit https://www.alfresco.com/develop.

Secondly, this is the first cloud release that used our improved agile process, which allows us to deploy new releases more frequently. This was made possible from our work last year to unify the Community, Enterprise and Cloud code lines.

The code we deployed in this cloud release includes important code fixes, as well as some groundwork for new capabilities that you will see later this year. (Stay tuned!)

We expect to continue making Cloud updates on a regular basis.

Finally, rather than stop and start the Alfresco Cloud in order to deploy the update, we were able to do a “rolling deploy” instead. There was no interruption of service for customers (and no need for the engineering and development teams to get up at the crack of dawn on a Sunday morning.)

We hope you find this most recent update useful!

Also, check out our recent post on why CMIS is important and why we are committed to our work in this area.

Categories: ECM

Five Success Factors for your Invoice Processing Solution

Alfresco - Tue, 04/01/2014 - 07:46

Invoice Processing Solution with Certified Technology Partner, Ephesoft

I was recently in Las Vegas for an event and had a run of luck at the gaming tables. Since this doesn’t happen frequently, I reflected on the few things that made the night so fun and successful. As I thought through this, there were strong parallels to an invoice processing solution from our Certified Technology partner, Ephesoft. Like the winning hands I had, Ephesoft is an “Ace in the Hole” in your Alfresco solution. It adds essential capture and classification to invoice processing, which streamlines the front-end work and prepares the documents for the workflow process.

In this post, I want to review a successful implementation and then reflect on 5 things to do to ensure you’ve got a winning hand for your business solution.

Case Study: BSA Limited
BSA Limited is Australia’s preeminent domestic satellite and technical services company for the broadcast industry. With 1,000 employees and over 1,400 contractors, BSA needed a simple way to manage invoices for its accounts payable departments across its eight national branches.

BSA selected Zia Consulting, a platinum Alfresco and Ephesoft partner, to develop, pilot, test and implement an integrated Alfresco and Ephesoft solution. Zia consultants worked with BSA to understand their accounts payable process and develop automated workflows to speed up approvals. Using the out-of-the-box features in Alfresco and Ephesoft, Zia was able to quickly build a solution that leverages rules capabilities and custom workflows.

From the project, they learned several keys to success:

1: Only make sure bets
Once you hit a few good hands, it’s easy to think that you’re on that winning streak. That’s where the luck runs out and you start losing. Considering your cards and the cards in play, and only making sure bets, is essential. This is true also for AP solutions. The ROI from document automation solutions is well documented in the Aberdeen Group’s May 2012 report, “Invoice Management in a Networked Economy”: best-in-class organizations require less than $2 to pay an invoice, while the average company pays over $15. The savings on 10,000 invoices can actually pay for the solution! This solution is a good bet!

2: Keep your money on the table
Casinos won’t let you pull money on and off dynamically – you need to play with your chips in front of you. Similarly, you need to keep track of cash consumed by Accounts Payable. Accurately tracking cash requirements and managing one-time payments can streamline systems. Without accurate information on the amount of cash on hand that AP needs, they can miss discounts for early payments or end up in cost overruns and need to request additional funds.

3. Know when to hold and when to fold
Like making a good bet, sometimes you need to chase a pot and sometimes you need to let it go. Knowing when to step away from the pot can save you from betting longer than you should. Likewise, you should be strategizing and mapping your AP process to regulatory controls and reporting requirements. Organizations with manual processes rely on reactive audits for compliance. Organizations that automate their AP processes can be a step ahead and achieve the savings that come along with it. A 2012 study by the Association of Finance Professionals reports that 61% of organizations had an attempted or actual fraud. The typical loss for each fraud incident was over $20,000. Documenting and enforcing policy, and automating systems around these defined processes, will help ensure that fraud does not affect your organization.

4. Always tip your waitress and dealer
Keeping the people around you happy makes the game run better. Similarly, in Accounts Payable, keeping suppliers, customers, and internal finance teams happy makes your job easier. How do you accomplish this? By automating invoice processing, you’ll have the foundation to better manage service level commitments. Metrics like DSO, cycle times, cost per invoice, throughput, queue management, and productivity can then be easily reviewed on a regular basis. By shifting time and resources from manual processing to managing automation, you’ll be able to deliver a better level of service to your stakeholders.

5. Change seats at the table every so often
Sitting at the same table gets old; visiting another table can change the game. In the AP world, having the flexibility to change to meet the organization’s dynamic strategy is essential. Back-office functions often have a reputation for inflexibility. Now you can be an agent of change in your organization by being responsive to changes in your business. Automation facilitates the movement of functions within your group and across geographies. Lead the charge in your organization to find new and more efficient ways to process.

Conclusions
Making investments in your business can be risky. Review these 5 lessons learned and see how you can improve your business processes. Alfresco can help mitigate the risk with certified technologies from partners like Ephesoft. From the BSA implementation and others, we’ve seen that AP solutions like invoice processing have a short payback period with realized ROI. A nine-month payback period is not unusual, so take a look at a solution today and you could see results as early as New Year’s.

 

For more information on solutions from Ephesoft, go to the Alfresco solution showcase.

 

Image credits: Las Vegas Sign:  Patrick O’Brien and Poker Chips:  LLudo

Categories: ECM

Liferay Mobile SDK Now Available

Liferay - Mon, 03/31/2014 - 14:59

Today Liferay released the first version of the Liferay Mobile SDK! [Download | Documentation | Project Page]

The Liferay Mobile SDK makes it super-easy for mobile developers to integrate mobile apps with Liferay Portal by taking care of common tasks like authentication, exception handling, and response parsing, and by exposing Liferay's JSON web services in their preferred mobile app development environment (e.g. Objective-C interfaces for iOS developers, Java interfaces for Android, and potentially others in the future). Custom objects and their web services (created via Service Builder) can also be accessed through the Mobile SDK just as easily as core services.

The Liferay Mobile SDK is compatible with Liferay Portal 6.2 and later. The Mobile SDK's official project page gives you access to the SDK releases, provides the latest SDK news, and has forums for you to engage in mobile app development discussions. Bruno Farache also did an excellent blog post for the beta release earlier this year with some working code examples and technical discussion.
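To give a flavor of what this looks like in practice, here is a minimal Android (Java) sketch in the spirit of the examples in Bruno's post. The portal URL and credentials are placeholders, and the exact package and method names (shown here for the 6.2-era SDK) can vary between SDK releases, so treat this as an illustration rather than reference code; it also requires a running Liferay Portal instance to execute.

```java
import org.json.JSONArray;

import com.liferay.mobile.android.auth.basic.BasicAuthentication;
import com.liferay.mobile.android.service.Session;
import com.liferay.mobile.android.service.SessionImpl;
import com.liferay.mobile.android.v62.group.GroupService;

public class SitesExample {

    public static void main(String[] args) throws Exception {
        // Authenticate against a (placeholder) local portal instance.
        Session session = new SessionImpl(
            "http://localhost:8080",
            new BasicAuthentication("test@liferay.com", "test"));

        // Each remote service gets a generated wrapper class; here we
        // call one of the built-in JSON web services and get back the
        // parsed JSON response.
        GroupService service = new GroupService(session);
        JSONArray sites = service.getUserSites();

        System.out.println(sites);
    }
}
```

The same pattern applies to custom Service Builder services: the Mobile SDK Builder generates a wrapper class per remote service, so calling your own services looks just like calling the core ones.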

Download and Install

There are two ways to get and use the Mobile SDK:

Liferay IDE / Developer Studio / Eclipse Plugin (for Android apps)

For Android developers, Liferay provides the Liferay Mobile SDK Eclipse plugin for you to use in developing your mobile Android apps. Its powerful Mobile SDK Builder generates libraries that enable your app to communicate with Liferay Portal and with the custom portlet services deployed on your Liferay Portal instance. Check out the Developer Guide for details on how to install the plugin into your environment.

Manual Installation (iOS / Android)

For Android and iOS developers, manual installation is pretty simple: download a JAR or ZIP file and import it into your project, whether in Eclipse, Xcode, or another development environment. The Developer Guide contains details on how to install the SDK into virtually any mobile development environment.

Versioning

Due to the close relationship between the Mobile SDK and Liferay itself, the Mobile SDK follows a similar release version scheme, and each release works with both CE and EE. Multiple Mobile SDKs can also be used in parallel (e.g. to support multiple major Liferay releases in a given app) thanks to the versioning present in the package namespaces.

Source Code

As an open source project, the Liferay Mobile SDK's source code can be found in its main GitHub repository or as a downloadable ZIP bundle.

Contributing

Contributions are the lifeblood of our community, and as an open source project, the Mobile SDK is no different. The process for contribution to the SDK is the same process used for Liferay itself. Simply fork the repository, make your contribution, and issue pull requests to the project lead(s). It's a great way to get involved and to give a little back to our community!

Documentation

The Mobile SDK's official documentation lives in the Liferay Developer Guide. It covers everything you need to know, including detailed guides for installation and development in both Java (Android) and Objective-C (iOS and Xcode), and will be updated as necessary as new features are added or changed.

Getting Support

Support for the Liferay Mobile SDK is included in Liferay's Enterprise Subscription, which provides regular service packs, a commercial SLA, and more.

If you are using Liferay Community Edition, visit the project site or check out the Mobile SDK Forums to find out more about the myriad avenues through which you can get your questions answered.

Bug Reporting

Like other Liferay projects, the Mobile SDK project uses issues.liferay.com to report and manage bugs and feature requests. If you believe you have encountered a bug in the new release (shocking, I know), please be cognizant of the bug reporting standards and report your issue on issues.liferay.com, selecting the Mobile SDK project and the 6.2.0.1 release as the value for the Affects Version/s field.

Feature Requests

If you have a great idea for the next Mobile SDK, be sure to file a feature request through the JIRA project or on the Ideas Dashboard (they both go to the same place!). If you have the time, consider contributing your amazing new idea to the project; we in the community would love to see what you've done!

 

 

James Falkner 2014-03-31T19:59:56Z
Categories: CMS, ECM

Alfresco Summit 2014 Save-the-Date

Alfresco - Mon, 03/31/2014 - 03:00

Could it already be time to start talking about Alfresco Summit 2014? Yes, indeed! I am pleased to announce that this year’s conference will take place in San Francisco on September 23rd, 24th, and 25th and London on October 7th, 8th, and 9th! Start making plans to attend now because you won’t want to miss this extraordinary event aimed at both business and technical users. Registration, venue, and pricing information will be announced soon.

The detailed schedule is subject to change but the plan right now is to run the conference with a similar flow as last year. The first day, a Tuesday at both events, is an optional day dedicated to training, a hack-a-thon, and other meetings. Tuesday night we’ll have a welcome reception for everyone and then the main conference will kick off Wednesday morning.

Wednesday and Thursday will be two full days of visionary keynotes, conversations with customers, business tracks, technical tracks, and a solution track.

Wednesday night we’ll have an unforgettable party. The Marketing team tells me they are planning something new and different for that evening–I’m looking forward to seeing what they come up with.

Thursday will be another full day of keynotes, product demos, and breakouts, with a closing plenary session at the end of the day. With the conference happening two months earlier than last year, it is already time to start thinking about submitting a speaking proposal. We are now accepting proposals for presentations. Submit your proposal here. To learn more about what makes a good speaking proposal, see my blog post on ecmarchitect.com.

Alfresco Summit is always such a great way to network and learn from others and the energy at the event is just amazing. All of us at Alfresco are looking forward to seeing all of you again this year in London and San Francisco!

If you would like to be notified when registration begins, sign up to receive updates here.

Categories: ECM

UploadPortletRequest and file size limits

Liferay - Wed, 03/26/2014 - 15:59

Uploading files into a portlet is quite a common requirement these days, and Liferay offers several ways to achieve it. The simplest way is to use a plain HTML form upload and retrieve the files on the server side using UploadPortletRequest. It does not require any JavaScript (if you don't want it to), but one has to keep in mind that this approach has some limitations. One of these is the size of the file, which can result in the file's binary content not being available on the server side.

I'll try to highlight the hurdles we had to get over while implementing this approach on a recent project. All examples are based on Liferay EE 6.1.10 ga1.

The first part you need to write is the HTML form inside the JSP of your portlet. It should look like the following example:
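The original form snippet did not survive in this copy of the post; what follows is a minimal sketch reconstructed from the requirements stated below. The action URL name and the file parameter name are illustrative, and the portlet taglib is assumed to be declared in the JSP:

```html
<%-- Hypothetical action URL; rename to match your portlet's action method. --%>
<portlet:actionURL name="uploadFile" var="uploadFileURL" />

<form action="${uploadFileURL}" method="POST" enctype="multipart/form-data">
    <input type="file" name="fileToUpload" />
    <input type="submit" value="Upload" />
</form>
```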

It is important to make sure you:

  • set method="POST" and
  • enctype="multipart/form-data".

Otherwise your file's content will never be sent in the HTTP request body to the server. Then your portlet code (action method) should look like this:
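The original action-method snippet is also missing from this copy; here is a minimal sketch, assuming a multipart form field named fileToUpload. The method name and parameter name are illustrative, and the code needs the Liferay 6.1 portal runtime (com.liferay.portal.kernel.upload and com.liferay.portal.kernel.util imports) to compile:

```java
// Sketch only; uploadFile and fileToUpload are hypothetical names.
public void uploadFile(ActionRequest actionRequest, ActionResponse actionResponse)
        throws Exception {

    UploadPortletRequest uploadRequest =
        PortalUtil.getUploadPortletRequest(actionRequest);

    String paramName = "fileToUpload";

    // When the configured upload limit was exceeded, getFile() returns
    // null and getSize() returns 0, so validate before reading and tell
    // the user about the limit.
    if (uploadRequest.getSize(paramName) <= 0) {
        return;
    }

    // Use getFileAsStream() (or getFile(paramName, true)) so that files
    // smaller than 1024 bytes, which Liferay keeps in memory only, can
    // be read as well.
    InputStream inputStream = uploadRequest.getFileAsStream(paramName);

    // ... process the stream ...
}
```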

Important parts related to file size:

  1. the file may have exceeded the upload limit (100 MB by default; depends on configuration)
    • see portal.properties -> com.liferay.portal.upload.UploadServletRequestImpl.max.size
    • there is no easy way to get the size of a file that was too big; it seems to be ignored completely (uploadPortletRequest.getFile(paramName) returns null, uploadPortletRequest.getSize(paramName) returns 0)
    • so at least inform the user about the limit (see the method validateFileParamNotEmpty())
  2. the file may have been stored in memory only (when it is smaller than 1 kB = 1024 bytes)
    • in that case request.getFile(paramName) or request.getFile(paramName, false) returns a file handle, but the file does not exist on disk (it was never written to the given path), so you cannot read the binary data from it
    • either use request.getFile(paramName, true) or request.getFileAsStream(paramName)

Hope this helps.

Josef Šustáček 2014-03-26T20:59:17Z
Categories: CMS, ECM

Alfresco at the core of iMinds research collaboration

Alfresco - Wed, 03/26/2014 - 03:00

For Flemish-based iMinds – an independent research institute designed to bring together companies, government and non-profit organizations on research projects – collaboration and agility of research is a true competitive advantage.

Innovation is, after all, iMinds’ core business.

“The biggest business problem we had to tackle before Alfresco was that we had an information overload,” said Jeroen Derynck, information and communication technology director for iMinds.

The organization has many users collaborating on research, including an extended network of partners, as well as over 1,000 researchers. This led to thousands of documents being exchanged primarily through personal e-mail and caused major issues of version control.

“Everything had to go through e-mail and people were working from the wrong versions. Instead of continuing the old way of working, we decided that we really had to make a change,” said Derynck.

Today, iMinds has some 50,000 documents stored on the Alfresco platform, as well as several thousand Wikis – all with associated workflows so that users can easily and securely edit, view, and collaborate on documents.

Alfresco has also helped iMinds meet various industry compliance regulations.

“Everything for us is built around the Alfresco platform,” said Derynck. “Our researchers no longer lose valuable time searching for documents or reformatting, so it really has helped us to streamline the research we are doing. It’s really core to our research methodology.”

iMinds currently has more than 2,000 users using Alfresco, which serves as the institute’s central document management and collaboration platform.

“The collaborative platform is really what sets us apart from our peers,” said Derynck. “Our customers love it, our researchers love it, and the platform is quite easy to use. Everyone can upload a document, everyone can edit…you have a team feeling, a feeling of a bonded community. That has been an added value of the Alfresco platform itself.”

To learn more about how iMinds is using Alfresco to enable secure research collaboration, watch this short video.

Categories: ECM

Are You Using the Right Content for the Right Segment of Your Funnel?

KnowledgeTree - Mon, 03/24/2014 - 13:51

In many ways, marketing and sales technology has been a revelation for modern businesses. It’s changed the way we operate, improved the efficiency and effectiveness of our activities, provided incredibly rich insight into how, why, when, and where customers engage with our brands, and indelibly altered how we view lead gen and the sales process.

In other ways, however, those tools have also made businesses a little lazy.

Some marketers and salespeople, after all, view automation software as an omnipotent, utopian solution to their lead-to-revenue woes. To convert customers, they just need to insert contact information into a database, deliver messages to those contacts every few days, and pass leads on to the sales team after a pre-determined amount of time.

Voilà! Deals get done. The company hits its revenue targets. And the sales and marketing teams get paid.

If only it actually worked that way.

Yes, marketing and sales automation software is a game-changer, and other similarly designed technologies are helping sales and marketing teams improve the effectiveness of their outreach and operate with the context of much greater customer intelligence. But that’s assuming, of course, that businesses are using that technology in the right ways.

According to BtoB’s 2013 Marketing Automation study, most businesses can’t say that, with 85 percent of companies failing to use marketing and sales technology to the full potential of their investment. The primary reason: A lack of relevant content.

The reality is that sales and marketing automation technology is little more than a collection of code that’s designed to automate or optimize specific tasks. Those tools certainly make it easier for users to communicate with prospects at any point in the sales cycle. But marketers and salespeople must still be equipped with the right content to engage prospects in the right segment of the funnel. Without that, marketing and sales technology is just a collection of bells and whistles.

Making better use of your technology’s functionality begins with creating content that is explicitly mapped to your buyer’s specific journey. You need to identify and understand the various stages prospects go through as they progress down the path to purchase, and determine the content formats (case studies, eBooks, blog posts, white papers, etc.) that best align with those funnel segments.

From there, you can fill in the blanks. Have you already created content that aligns with buyers’ needs at specific stages of your funnel? If so, can that content be tweaked so that it more specifically addresses buyer pains and needs? Are certain content formats or channels delivering better results than others? If so, why? And how can you adjust your content creation and distribution strategies to reflect that insight?

The bottom line is that while many B2B buyers travel along similar paths — deviating at times, and progressing or regressing at different velocities — the drivers of their journey can vary greatly. Your content — and your technology — must be leveraged with those things in mind, or you’ll never truly tap into the potential rewards of those resources.

The post Are You Using the Right Content for the Right Segment of Your Funnel? appeared first on KnowledgeTree.

Categories: ECM

Alfresco Certified Technologies: Trusted and Proven Solutions

Alfresco - Mon, 03/24/2014 - 08:27

Content doesn’t live in a vacuum – to achieve business value, Alfresco is connected to other key business applications and technologies.

To support increasing use cases and integrations, Alfresco has built a Certified Technology initiative to promote complementary software products that extend or integrate with Alfresco. Certified Technologies must comply with a defined set of industry-accepted and Alfresco-specific technical standards. After going through a rigorous evaluation process, we give them a “certified” stamp of approval that has several benefits:

  • Customers benefit by knowing that the certified technology they’ve selected has met Alfresco’s technical standards and is supported by a vendor.
  • System integrators can choose proven technologies from a growing ecosystem of high-quality Alfresco-compatible software and build complete Alfresco Business Solutions for their customers.
  • Technology partners benefit by being recognized as leaders in building high-quality, standards-based products, and they gain more options for helping their customers innovate with open solutions built on Alfresco.

This program is in the growing stages but we are proud to introduce the following recently certified solutions:

  • Ephesoft: document capture solutions that help businesses run more efficiently and respond to change cost-effectively by automatically classifying, separating, sorting, and extracting data from documents in paper, fax, and electronic formats.
  • connexas for SAP: connects SAP to Alfresco based on SAP Content Server 6.20 (the ArchiveLink protocol). The product can also exchange metadata in both directions between Alfresco and SAP.
  • Brava for Alfresco: Brava makes it easy to view, annotate, and redact content in virtually any file format, including Microsoft Office documents, PDFs, CAD drawings, images, and more, all from within a single, intuitive interface.
  • Formtek EDM: for managing engineering documents from AutoCAD.

For information on Alfresco Certified Technologies and other integrations, check out this page.

Categories: ECM
Syndicate content