Mobile First – Mobile Inspired Editing for WSM

<%stf_SENSATIONAL_CLAIM_OF_FIRST_MOBILE_FIRST_INLINE_EDITING_EXPERIENCE%>

It’s arrived, and there have been enablement webinars on this new editor experience (available on KC under the 11.2 release, Mobile Editing Deep Dive). The feedback I and others hear from demonstrations is that many feel this is a better approach for editors. Frank Steffen mentioned on the latest call that features are still being added to the mobile editor and that supported browsers are expanding beyond Chrome for desktop.

Are you using this yet? Do you need a demonstration?

Pairing this with a Responsive Web Design refresh will make your editors, and your end users on mobile devices, happy. We’ve retrofitted a few existing sites with the new Xample project, the OpenText Services RWD Prebuilt Project, and similar designs in less than 30 days. If you want OT to do this for you, or just to give you a kick start, reach out.

 

WSM Blossoming in the Spring

The leaves are coming out, the flowers are in bloom, and OpenText Web Site Management (WSM) innovations on top of a refreshed core architecture have either rolled out already or are slated to soon.

  • Find out about a pre-built HTML5 Responsive Web Design implementation for WSM: http://www.solutionexchange.info/2064.htm
  • Cross-browser SmartTree & new public APIs in v11.1 – yes, we listen to those ideas on Solution Exchange; the #1 and #2 items are complete or on the road map.
  • Find out what’s in the soon-to-be-released v11.1 and the road map of new features through 2015: https://knowledge.opentext.com/knowledge/cs.dll?func=ll&objId=34146964&objAction=browse
  • Watch Solution Exchange and Twitter for Manuel’s teasers of new features coming in v11.2 (around Q4), like tablet editing
  • A new version of Portal Site Management is pending as well, with exciting possibilities from what SAP has done with SAP UI5 and mobile portal. That isn’t the same old grey and orange portal I started working with about a decade ago.

It’s an exciting time to work with OpenText WSM, with more to come. I also enjoy seeing the EIM strategy taking hold in R&D; there are new plans in the works that could help customers deliver much better Customer Experiences as well.

 

Globalized Personali(z|s)ation – Baked in goodness

I’m going to assume you have an understanding of traditional WCM deployment patterns and recognize that, at its core, OpenText Web Site Management (RedDot) is a baking machine. When OpenText WSM publishes, it renders the approved final version of the content out statically to a location. That doesn’t mean its WCM journey is done. I’m about to describe a method to deploy a global personalization system. However, you could swap out where I have OpenText Delivery Server for ASP.NET, PHP, JSP, RoR, Django, or any language or framework you want. OpenText Professional Services has a set of practices to help meld your development methodology with content creation to liberate your application developers. But back to the topic at hand. When I talk about Delivery Server it is often in the context of par-baking, a term I had been using before seeing Seth’s post or other sources. The following method of par-baking in particular allows you to optimize personalized delivery without segmenting user storage by location. This scenario has come up many times over the years. Here is a general answer:

Set up N data center locations (assume 3 for the sake of argument), with Global Traffic Management (GTM) routing each request to whichever of the N data centers is closest to the client. This allows for optimal delivery speed of content, both personalized and not. For this discussion, let’s assume each data center location has its own local database server with no replication to the other database instances. Content is replicated out from the WSM Management Server to every data center, and this can include DynaMents or XSL configurations. Any configuration can be promoted from development to all N production data centers using Delivery Server transport packages. Users and personalization data remain the concern. WSM Delivery Server with the Developer Toolkit activates SOAP web services that expose an API to edit user data, and Delivery Server also provides a SOAP-based Web Services Connector. With these services and the connector, a Delivery Server DynaMent configuration can be developed such that when a user’s profile is updated, the change is replicated to all data centers via Web Services DynaMents. The synchronization can be tuned for optimal user experience by making it asynchronous with AJAX, which prevents page loads from blocking on distant calls with longer write times.
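
To make the “asynchronous with AJAX” piece concrete, here is a minimal jQuery sketch of what the browser side could look like. It assumes a hypothetical local Delivery Server page (profile-sync.htm) that wraps the Web Services DynaMents doing the actual replication, and the form and field ids are invented for illustration.

// Rough sketch only: after the profile form posts to the local data center,
// fire a non-blocking call to a (hypothetical) DS page that runs the
// Web Services DynaMents replicating the change to the other data centers.
$(document).ready(function() {
	$("#profileForm").submit(function() {
		// fire-and-forget: we deliberately do not wait on the distant writes,
		// so the user's page response is never held up by them
		$.post("/cps/rde/xchg/profile-sync.htm", { user: $("#username").val() });
		return true; // let the normal (local) form submit continue as usual
	});
});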

For global companies, the performance of their sites is a factor in the user experience. Having friends and colleagues in Australia and Germany, I know this from the feedback I’ve gotten working on shared servers. I’m also lucky enough to have friends in Europe who now work at Compuware (Gomez); they’ve conveyed the experience gains of having your content hosted in your region or country, so there is real ROI in investing in global infrastructure. This doesn’t rule out CDNs. In fact, this approach works for active content and user profile synchronization with DR as well, which is important to support a CDN strategy for your origin servers. Pairing this approach with OpenText WSM’s multi-language capabilities to produce localized sites is always a great combo.

Have thoughts on the approach? Are you using database-level replication and having a good experience?

Responsive Web Site Management

Responsive Web Design (RWD) is a term bouncing around marketing and web teams. I’m going to assume you’ve read some articles or good books on the topic; where it hasn’t been heard of yet, I’ve injected it into the mind share of a few organizations. Why is this important? The ability to deliver to multiple channels matters, and the concept has been in OpenText Web Site Management (WSM) for as long as I’ve worked with it: WSM’s multi-channel support mechanism, “Project Variants”, lets you publish multiple copies of the same content with different markup. So why write about an approach that reproduces what the product offers? Because it reduces publication effort on your servers while getting the same content onto different devices. This practice empowers your web designers and developers: simply change your CSS/JavaScript and, with no changes to your WSM implementation, you can support a new device.
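
As a small illustration of that last point (the breakpoint and class names below are invented, and in practice most of this lives in a CSS media query), a script-side hook for a new device class needs nothing at all from the WSM project:

// Plain JavaScript sketch: support a new device class purely client-side.
// The published WSM pages are untouched; only the CSS/JS assets change.
var phoneQuery = window.matchMedia("(max-width: 480px)");

function applyLayout(mq) {
	// CSS handles most of the work via the same media query; this hook is only
	// for behaviour CSS cannot express (e.g. swapping a mega-menu for a toggle).
	document.documentElement.className = mq.matches ? "layout-phone" : "layout-desktop";
}

applyLayout(phoneQuery);              // set the initial state on load
phoneQuery.addListener(applyLayout);  // re-apply when the viewport crosses the breakpoint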

Want to learn more about WSM & RWD? There is an upcoming webinar from the WSM product team.

[Image: RWD teaser]

The OpenText Global Services team is about to release information on Solution Exchange about a pre-built WSM project that incorporates a Responsive Web Design. Some notes on the goals of that solution:

  • Common Content Types
  • Common UI elements that support most sites’ needs
  • Skinnable HTML5 Framework with an active community
  • OpenText Global Services standard practices for navigation, project structure, and editor UX
  • Launch a micro-site or start a site redesign faster
  • Reduce initial setup time to get to greater value features faster
  • Lower cost and risk than implementing WSM from scratch

 

There are a number of partners, customers, and of course OpenText Global Services working with Responsive Web Designs on WSM. Need help getting started? Reach out to your preferred integrator or to OpenText. I’m looking forward to seeing more launches & relaunches in 2013.

Zombie Undead?

Note: This is a personal perspective, not the official response from OpenText Corporation.

Am I part of the Zombie Apocalypse?

Tony is right that he blogs about this product (OpenText Web Site Management, formerly RedDot CMS & LiveServer) at least once every few years (no news there). By his own admission he fell out of love in 2007 and has gone negative ever since. If this is the first article of theirs you’ve seen: from my perspective, RSG steadily puts out blog posts to stir up doubts about people’s current platforms, or the ones they may be contemplating for selection. These target not just OpenText, although our leadership in Enterprise Information Management (ECM, WCM, DAM, Portal, and the other topics they cover) makes us a big target as a lead generation mechanism. All of this comes with the not-so-subtle hint to retain RSG as consultants to help pick your next vendor. I’ll steer clear of calling them sensational, but there are times I’ll read the articles myself for bulletin-board material, whether on us or a competitor. I don’t mind that he sees us as “undead”. Current and new customers realize to “… put technology of all types, especially content management, into its proper context — It’s neither the reason for success nor the reason for failure.” (Scott Liewehr, 2012). It seems Tony hasn’t noticed that OpenText has formalized new channels for the Web Site Management product. Perhaps this calls into question the Portal reviews in comparison to his competitors, who have us as part of the solution in the upper right corner (I’m not talking about OT Portal, though we do that too). It is interesting when one analyst calls something out as having no multi-channel support and another gives specific examples of web content with multi-channel reuse capabilities. Their blog is free and they are selling something, just like I am; I wish them the best of luck and I hope they are impartial in their selection process, if not in their editorial tone.

As I see it, the Vignette acquisition forced a pivot in 2009. OpenText, as all the selection analysts say, agrees that no single WCM fits every customer need. Many analysts say you may need multiple WCM systems for your enterprise. If that is true, why wouldn’t vendors build and maintain a series of solutions that fit more of their customers’ needs? OpenText has done so since 2006 on the ECM (Document Management) side. In 2009-2010 OpenText discussed this as a continuum of WCM maturity, similar to the AIIM maturity model for ECM. This is now really part of a spectrum from simple WCM to WEM. Both OpenText solutions have a place in Customer Experience Management for different organizations, and sometimes in the same organization. Sure, there are some overlaps, but OpenText partners with organizations to define when and where to use each of our tools. In 2012 I read The Innovator’s Dilemma and Understanding Michael Porter, which gave some perspective. What you see from many former mid-market solutions (SDL Tridion, Adobe CQ5, OpenText WSM, Sitecore) is that they move up-market as new lower-cost followers enter the market. Sometimes they succeed, sometimes they fail; often, if they focus only on that up-market move rather than on a core value proposition and the audience interested in it, they won’t evolve and will later see disruption. Their customers often face quite a few challenges during the move up-market, because resources are thin and partners are hopping on the bandwagon of the hot new WCM in town. With the Vignette acquisition, OpenText stopped pushing WSM increasingly up-market and has been refocusing on its core value propositions and value chains. Why did our current customers acquire our software? Because it is an editor-focused, multi-language, multi-variant (read multi-channel, for non-project-builders), markup- and development-language-agnostic solution that does not require specific developer skills and has baking at its core; that is what drew people to it in the first place. In the last two years R&D has focused on paying down our well-documented technical debt (today’s Management Server release is a fully retooled 64-bit .NET 4 back-end). We aren’t done. We did this without a forced architectural/language switch like Vignette from v6 to v7, or Documentum, as was originally planned. This has allowed our customers to enjoy a refreshed SmartEdit UI, the add-on of proven enterprise Social in OpenText Tempo Social, and a continued evolution. In North America we’ve refocused on our core, on verticals, and on Alliances (SAP & Microsoft).

Those of you who’ve picked RedDot or OpenText Web Site Management, along with the long-term active partners (OpenText, Enthink, and others), have for the most part seen great value in the solution. My view is that IT/software is a human process and there is always room for improvement. If you are a customer who is struggling, I’d hope you’d reach out to renew your partnership with us, or with a partner we’d recommend, to help you in a turnaround. Sitting down with Arek and Wojtek (from Enthink) at Enterprise World in November, we all saw a bright new era for our customers and prospects in what OpenText’s EIM strategy can mean now that OpenText is no longer scope-locked on marketing and developing to support growth in ECM alone. That was a sentiment I got from the WSM customers I spoke with there as well. WCM (or pick your favorite acronym) isn’t the only part of your sites (internal, public, or private), and it never was.

Now I can look forward to another year as a “zombie”, having my team work with our customers and partners. I look forward to the launches and relaunches of some exciting web properties; if you want to know more, just ask. My team will soon publicly launch a new solution to support redesigns and micro-sites and to illustrate our standard best practices, at a price point all can budget for. With that I guess I’ll close by saying Happy New Year & enjoy the brains!

OpenText WSM and Wave Together

What follows is a simple recipe for creating an OpenText Wave Application filled with content from OpenText Web Site Management.

Ingredients:
1 existing WSM Project with an Articles Content Class (you may have one or many available)
1 existing Wave Application Template with NewsList and NewStory XML Consumer

Take WSM Project


Add Articles Variant to convert HTML to text for native applications:


Here’s a view of the template, not much needed.


Publish that out.

This Delivery Server project already had target DynaMents delivering personalized news.


I made a minor change to the target DynaMent to use the variant content:


Some minor tweaks were needed. Then I pointed the URL for the NewsList template at WSM Delivery Server, and this is the resulting application:


Results:

Cross-platform native applications (iOS, Android, BlackBerry) providing content reuse in a different channel.

Next steps:
1) Clean up the content for the application a bit.
2) Examine sharing the sample configuration (DS Transport Package, MS Content Class Template export & documentation) on SolEx. This took about 2 days of effort scattered in amongst other tasks; having a sample configuration will dramatically cut development time.

Adventures in Form Handling – II

The next step was generating the query with DS Attributes. The nice thing about SQL is that it doesn’t care about line breaks, so you can condense the statement down to one line. That is good, because the XML engine in Delivery Server may not preserve them. I didn’t test this conclusively; if you need whitespace or formatting preserved, say for a REST call, you will want to test thoroughly.
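
I condensed mine by hand, but if you are generating the statement somewhere else first, a one-liner like this (plain JavaScript, variable names invented) collapses the whitespace before you paste it into the DS attribute value:

// collapse all runs of whitespace (including line breaks) into single spaces
var multiLineSql = "DECLARE @formID as varchar(50)\n" +
                   "DECLARE @InsertOutputForm table ( form_id varchar(50) );";
var oneLineSql = multiLineSql.replace(/\s+/g, " ").trim();
// oneLineSql -> "DECLARE @formID as varchar(50) DECLARE @InsertOutputForm table ( form_id varchar(50) );"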

This is how I broke down the SQL to ‘templatize’ it for Delivery Server. Each line that needs Delivery Server variables is broken out.

DECLARE @formID as varchar(50) DECLARE @InsertOutputForm table (  form_id varchar(50) );DECLARE @InsertOutputFormFields table( id varchar(50), form_id  varchar(50), name  varchar(50),  type varchar(50)); INSERT INTO dbo.forms (name,type,dsuser) OUTPUT    INSERTED.id as form_id  INTO @InsertOutputForm 
VALUES (N'partnerinfoform', N'form',N'cindy')
SET @formID=(SELECT form_id FROM @InsertOutputForm); INSERT INTO dbo.form_fields (form_id, name, type, data)OUTPUT INSERTED.id, INSERTED.form_id, INSERTED.name, INSERTED.type  INTO @InsertOutputFormFields VALUES 
 (@formID, 'products', 'string', 'products'),
 (@formID, 'solutions', 'string', 'solutions'),
 (@formID, 'successes', 'string', 'successes'),
 (@formID, 'resources', 'string', 'resources'),
 (@formID, 'comments', 'string', 'test'),
 (@formID, 'submit', 'string', ''),
 (@formID, 'step', 'string', '430'),
 (@formID, 'skey', 'string', 'SID-04000407-1F158E8D')
;

Next prepare to drop it into a DS XML file with pseudo inline attribute syntax.

DECLARE @formID as varchar(50) DECLARE @InsertOutputForm table (  form_id varchar(50) );DECLARE @InsertOutputFormFields table( id varchar(50), form_id  varchar(50), name  varchar(50),  type varchar(50)); INSERT INTO dbo.forms (name,type,dsuser) OUTPUT    INSERTED.id as form_id  INTO @InsertOutputForm 
VALUES (N'[#formname#partnerinfoform#]', N'[#formtype#form#]',N'[#rde-fields.user#]')
SET @formID=(SELECT form_id FROM @InsertOutputForm); INSERT INTO dbo.form_fields (form_id, name, type, data) OUTPUT INSERTED.id, INSERTED.form_id, INSERTED.name, INSERTED.type  INTO @InsertOutputFormFields VALUES 
 (@formID, '[#fieldname#]', '[#fieldtype#string#]', '[#data#]'),
;
<!-- Split Fields -->
		<rde-dm:attribute mode="write" attribute="request:ft.ssv.fields" value="[#request:_sf_form_fieldnames_#].substring(1)" value-separator="|" />
		<rde-dm:attribute mode="write" attribute="request:ft.query.segment1" value="DECLARE @formID as varchar(50) DECLARE @InsertOutputForm table (  form_id varchar(50) );DECLARE @InsertOutputFormFields table( id varchar(50), form_id  varchar(50), name  varchar(50),  type varchar(50)); INSERT INTO dbo.forms (name,type,dsuser) OUTPUT    INSERTED.id as form_id  INTO @InsertOutputForm " />
		<!-- replace pseudo code -->
		<rde-dm:attribute mode="write" attribute="request:ft.query.segment2" value="VALUES (N'[#request:_sf_form_name_#partnerinfoform#]', N'[#request:formtype#form#]',N'[#user:rde-fields.login#anonymous#]') "/>
		<rde-dm:attribute mode="write" attribute="request:ft.query.segment3" value="SET @formID=(SELECT form_id FROM @InsertOutputForm); INSERT INTO dbo.form_fields (form_id, name, type, data) OUTPUT INSERTED.id, INSERTED.form_id, INSERTED.name, INSERTED.type  INTO @InsertOutputFormFields VALUES " />
		
		<!-- Loop Over Fields -->
		<rde-dm:attribute mode="for-each" attribute="request:ft.ssv.fields" alias="field" tag="fields">
			<rde-dm:attribute mode="condition">
				<rde-dm:constraint>context:field NE ""</rde-dm:constraint>
				<field>
					<!-- Get Field Name -->
					<![CDATA[-]]><rde-dm:attribute mode="read" attribute="context:field"/><![CDATA[:]]><rde-dm:attribute mode="read" attribute="request:[#context:field#]" /><![CDATA[;<br/>]]>
					<rde-dm:attribute mode="write" attribute="request:ft.query.segment4" value="[#request:ft.query.segment4#], (@formID, '[#context:field#].replace(';',', ')', '[#fieldtype#string#]', '[#request:[#context:field#badField#]#].replace(';',', ')')"/>
					
					<rde-dm:attribute mode="write" attribute="request:rdb.columns" value="[#request:rdb.columns#], form_fields.[#context:field#]"/>
					<rde-dm:attribute mode="write" attribute="request:rdb.values" value="[#request:rdb.values#],&quot;[#request:[#context:field#badField#]#].replace(';',', ')&quot;"/>
				</field>
			</rde-dm:attribute>
		</rde-dm:attribute>
		
		<rde-dm:attribute mode="write" attribute="request:ft.rdb.formquery" value="[#request:ft.query.segment1#] [#request:ft.query.segment2#] [#request:ft.query.segment3#] [#request:ft.query.segment4#].trim().substring(1);" value-separator="" />
		<![CDATA[<br/>]]>
		<rde-dm:attribute mode="read" attribute="request:ft.rdb.formquery" /><![CDATA[<br/>]]>
		<!-- TODO: 
			- sql query submit
			- validate
			- add date&time of submit
			- set campaign step code = true
			- 
		-->
		<rde-dm:rdb mode="query" alias="otwsm_supplemental" sql="select count(*) as count from dbo.form_fields;"/><![CDATA[-pre count <br/>]]>
		<rde-dm:rdb mode="update" alias="otwsm_supplemental" sql="[#request:ft.rdb.formquery#]"/>
		<rde-dm:rdb mode="query" alias="otwsm_supplemental" sql="select count(*) as count from dbo.form_fields;"/><![CDATA[- post count<br/>]]>

Unfortunately I quickly learned I should have RTFM’d. I ran straight into a blocker.


Unexpected error occurs:This statement isn't allowed=DECLARE @formID as varchar(50) DECLARE @InsertOutputForm table ( form_id varchar(50) )

The RDB DynaMent limits the types of commands that can be executed, so I seemed to be out of luck. Except there is the statement mode, which I don’t believe I’ve used before; it will let you run unrestricted SQL statements. If you are running simpler queries, do use ‘query’ or ‘update’ to help protect against SQL injection.

<!-- Split Fields -->
		<rde-dm:attribute mode="write" attribute="request:ft.ssv.fields" value="[#request:_sf_form_fieldnames_#].substring(1)" value-separator="|" />
		<rde-dm:attribute mode="write" attribute="request:ft.query.segment1" value="DECLARE @formID as varchar(50) DECLARE @InsertOutputForm table (  form_id varchar(50) );DECLARE @InsertOutputFormFields table( id varchar(50), form_id  varchar(50), name  varchar(50),  type varchar(50)); INSERT INTO dbo.forms (name,type,dsuser) OUTPUT    INSERTED.id as form_id  INTO @InsertOutputForm " />
		<!-- replace pseudo code -->
		<rde-dm:attribute mode="write" attribute="request:ft.query.segment2" value="VALUES (N'[#request:_sf_form_name_#partnerinfoform#]', N'[#request:formtype#form#]',N'[#user:rde-fields.login#anonymous#]') "/>
		<rde-dm:attribute mode="write" attribute="request:ft.query.segment3" value="SET @formID=(SELECT form_id FROM @InsertOutputForm); INSERT INTO dbo.form_fields (form_id, name, type, data) OUTPUT INSERTED.id, INSERTED.form_id, INSERTED.name, INSERTED.type  INTO @InsertOutputFormFields VALUES " />
		
		<!-- Loop Over Fields -->
		<rde-dm:attribute mode="for-each" attribute="request:ft.ssv.fields" alias="field" tag="fields">
			<rde-dm:attribute mode="condition">
				<rde-dm:constraint>context:field NE ""</rde-dm:constraint>
				<field>
					<!-- Get Field Name -->
					<![CDATA[-]]><rde-dm:attribute mode="read" attribute="context:field"/><![CDATA[:]]><rde-dm:attribute mode="read" attribute="request:[#context:field#]" /><![CDATA[;<br/>]]>
					<rde-dm:attribute mode="write" attribute="request:ft.query.segment4" value="[#request:ft.query.segment4#], (@formID, '[#context:field#].replace(';',', ')', '[#fieldtype#string#]', '[#request:[#context:field#badField#]#].replace(';',', ')')"/>
					
					<rde-dm:attribute mode="write" attribute="request:rdb.columns" value="[#request:rdb.columns#], form_fields.[#context:field#]"/>
					<rde-dm:attribute mode="write" attribute="request:rdb.values" value="[#request:rdb.values#],&quot;[#request:[#context:field#badField#]#].replace(';',', ')&quot;"/>
				</field>
			</rde-dm:attribute>
		</rde-dm:attribute>
		
		<rde-dm:attribute mode="write" attribute="request:ft.rdb.formquery" value="[#request:ft.query.segment1#] [#request:ft.query.segment2#] [#request:ft.query.segment3#] [#request:ft.query.segment4#].trim().substring(1);" value-separator="" />
		<![CDATA[<br/>]]>
		<rde-dm:attribute mode="read" attribute="request:ft.rdb.formquery" /><![CDATA[<br/>]]>
		
		<rde-dm:rdb mode="query" alias="otwsm_supplemental" sql="select count(*) as count from dbo.form_fields;"/><![CDATA[-pre count <br/>]]>
		<rde-dm:rdb mode="statement" alias="otwsm_supplemental" sql="[#request:ft.rdb.formquery#] SELECT id, form_id, name, type FROM @InsertOutputFormFields;"/>
		<rde-dm:rdb mode="query" alias="otwsm_supplemental" sql="select count(*) as count from dbo.form_fields;"/><![CDATA[- post count<br/>]]>
		<!-- TODO: 
			- add date&time of submit
		-->
		<rde-dm:attribute mode="read" attribute="request:step" /><![CDATA[-step<br/>]]>
		<rde-dm:attribute mode="write" op="set" attribute="user:campaign.step[#request:step#]" value="true"/> Set to True
		<rde-dm:attribute mode="read" attribute="user:campaign.step[#request:step#]" /><![CDATA[-step<br/>]]>
	<rde-dm:attribute mode="condition">
		<rde-dm:constraint>(request:redirect-target NE '') AND (request:debug NE 'true')</rde-dm:constraint>
			Redirect Now!<!-- redirect somewhere -->
			<rde-dm:process mode="redirect" type="http" url="http://10.25.0.51/demo/en/partner.htm" >
				<rde-dm:include content="[#request:redirect-target#content/en/index.htm#]"/>
			</rde-dm:process>
	</rde-dm:attribute>

That is the working result: a form handler that works with the “universal” storage tables to handle a SmartForm post.

Adventures in Form Handling

In part this week I’m working on a form handling setup. I created a SmartForm and set about writing its input to a database via Delivery Server. I decided to make a more generic handler: we have some tools around for processing (client-side validation, server-side validation, JCaptcha) but no universal storage mechanism at hand. Conceptually, this same processing could be used in tandem with a SmartForm, forms created by dragging and dropping from panels in SmartEdit, a template form, or an external or legacy form. I’ve decided to just store the data in a couple of tables, as that will allow the flexibility to integrate with other applications later on.

Goal:

Create a Delivery Server form handler to store M forms with N form fields per form. (OK, I don’t really expect it to scale without limit.)

DDL:

First I created some tables. Here they are.

MS SQL 2008 DB Diagram

USE [otwsm_supplemental]
GO

/****** Object:  Table [dbo].[forms]    Script Date: 08/03/2011 09:24:40 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[forms](
	[id] [uniqueidentifier] ROWGUIDCOL  NOT NULL,
	[name] [nvarchar](50) NOT NULL,
	[type] [nvarchar](50) NOT NULL,
	[dsuser] [nvarchar](50) NOT NULL,
 CONSTRAINT [PK_forms] PRIMARY KEY CLUSTERED
(
	[id] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

ALTER TABLE [dbo].[forms] ADD  CONSTRAINT [DF_forms_id]  DEFAULT (newid()) FOR [id]
GO

ALTER TABLE [dbo].[forms] ADD  CONSTRAINT [DF_forms_type]  DEFAULT (N'form') FOR [type]
GO

ALTER TABLE [dbo].[forms] ADD  CONSTRAINT [DF_forms_user]  DEFAULT (N'anonymous') FOR [dsuser]
GO

USE [otwsm_supplemental]
GO

/****** Object:  Table [dbo].[form_fields]    Script Date: 08/03/2011 09:24:27 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[form_fields](
	[id] [uniqueidentifier] NOT NULL,
	[form_id] [uniqueidentifier] NOT NULL,
	[name] [nvarchar](50) NULL,
	[type] [nvarchar](50) NULL,
	[data] [nvarchar](max) NULL,
 CONSTRAINT [PK_form_fields] PRIMARY KEY CLUSTERED
(
	[id] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

ALTER TABLE [dbo].[form_fields]  WITH CHECK ADD  CONSTRAINT [FK_form_fields_forms] FOREIGN KEY([form_id])
REFERENCES [dbo].[forms] ([id])
GO

ALTER TABLE [dbo].[form_fields] CHECK CONSTRAINT [FK_form_fields_forms]
GO

ALTER TABLE [dbo].[form_fields] ADD  CONSTRAINT [DF_form_fields_id]  DEFAULT (newid()) FOR [id]
GO

ALTER TABLE [dbo].[form_fields] ADD  CONSTRAINT [DF_form_fields_type]  DEFAULT (N'string') FOR [type]
GO

SQL:

I decided I wanted to insert from a form post with one interaction between DS and SQL.  For now I decided not to make a prepared statement even though they are supported in v10.1.

(If there are better ways to do the insert let me know in the comments)

DECLARE @formID as varchar(50)
DECLARE @InsertOutputForm table
(
  form_id varchar(50)
);
DECLARE @InsertOutputFormFields table
(
  id varchar(50),
  form_id varchar(50),
  name varchar(50),
  type varchar(50)
);

INSERT INTO dbo.forms (name, type, dsuser)
OUTPUT
  INSERTED.id as form_id
  INTO @InsertOutputForm
VALUES (N'test', N'partnerinfoform', N'cindy')

SET @formID = (SELECT form_id FROM @InsertOutputForm);

INSERT INTO dbo.form_fields (form_id, name, type, data)
OUTPUT
  INSERTED.id, INSERTED.form_id, INSERTED.name, INSERTED.type
  INTO @InsertOutputFormFields
VALUES (@formID, 'products', 'string', 'products'),
  (@formID, 'solutions', 'string', 'solutions'),
  (@formID, 'successes', 'string', 'successes'),
  (@formID, 'resources', 'string', 'resources'),
  (@formID, 'comments', 'string', 'test'),
  (@formID, 'submit', 'string', ''),
  (@formID, 'step', 'string', '430'),
  (@formID, 'skey', 'string', 'SID-04000407-1F158E8D');

SELECT id, form_id, name, type FROM @InsertOutputFormFields;

RESULTS:

After some struggles getting to know the OUTPUT clause and the proper variable syntax, I finally got MS SQL 2008 to do all the work.

MS SQL 2008 result set (query result)

Next step: writing the DS page logic to format the query.

 

DS API and RQL via AJAX – A Test Drive

Recently in the community I’ve been advocating and drumming up support for the DS REST API, and Jian, Kim, and Manuel have been discussing MS plug-ins that use AJAX to call RQL. I recently got an opportunity to give both a test drive, to see how they would work for rapidly prototyping a solution for a Proof of Concept (POC). I was impressed.

The scenario (as originally understood):

To create a set of JSP pages that provide access to WSM:

1) DS to display different content based on date range, resolution, and later an audience category for personalization.

2) Remote access to Asset Manager contents, returning on selection the relative paths where published files will reside.

DS REST API In Depth:

After installing (I used the quick start guide as a backup) and testing base functionality, I jumped back to Management Server to create content with metadata to drive the searches: some assigned by categories and keywords, some by standard fields, and some embedded in the content class templates. After publishing, I created a redundant database structure.

Screen capture of attributes and constraint

I created a new handler by copying the existing Target DynaMent handler. I removed authentication for the handler; as this is a POC, it just makes things simpler. I then developed a query with my specified parameters.

http://host/api/1/newtarget.html?dbs=wcs&project=demo&include-mode=content,content&chunksize=1&chunk=1&sortedby=cms.date.wcsdatetime&ignore-constraints=completely&sortorder=desc&attributepath=none&constrainttime=20110623&hres=300

I added a couple of DynaMents to write the constraints for me based on the values of ‘constrainttime’ and ‘hres’, and I had a single content item returned per date/resolution. I finished it up by adding a simple HTML XSLT so the content of this one item is accessible as if it were called directly. The rest was some JSP to scrape characters from a URL into a StringBuffer. A much more sophisticated option is possible, but it is a POC.
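
In the POC the handler’s output was pulled in server-side with JSP, but for a rough idea of how the same handler could also be consumed straight from the browser, a jQuery sketch along these lines would do (the container id is invented, the host and query parameters are the same as above, and same-origin/CORS concerns are ignored here):

// fetch the XSLT-rendered fragment from the custom REST handler and drop it into the page
var restUrl = "http://host/api/1/newtarget.html?dbs=wcs&project=demo&include-mode=content,content" +
              "&chunksize=1&chunk=1&sortedby=cms.date.wcsdatetime&ignore-constraints=completely" +
              "&sortorder=desc&attributepath=none&constrainttime=20110623&hres=300";

$.get(restUrl, function(html) {
	$("#newtarget-result").html(html);   // #newtarget-result is a made-up placeholder element
});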

RQL via AJAX

var loginguid = "";
var sessionguid = "";

$(document).ready(function() {
	// hide search result area
	$("#searchresult").hide();
	if($.cookie("assetlogin") == null || $.cookie("assetsession") == null || $.cookie("assetlogin") == "" || $.cookie("assetsession") == "")
	{
		login();
	}else{
		if("<%= session("sessionkey") %>" == "" || "<%= session("loginguid") %>" == ""){
			loginguid = $.cookie("assetlogin");
			sessionguid = $.cookie("assetsession");
			DisplayAssets("FE46C4F25F4847F4A983C6F41EDC42A3");
		}else{
			//alert("asp vars");
			loginguid = "<%= session("loginguid") %>";
			sessionguid = "<%= session("sessionkey") %>";
			DisplayAssets("FE46C4F25F4847F4A983C6F41EDC42A3");
		}
	}
});

function DisplayAssets(FolderGuid)
{
	if(FolderGuid == "")
	{
		$("#searchresult .content").append("<div class=\"error\">Error</div>");
		return;
	}

	$("#searchresult .content").empty();

	//load simple page info
	var strRQLXML = padRQLXML("<MEDIA><FOLDER guid=\"" + FolderGuid + "\" subdirguid=\"" + FolderGuid + "\"><FILES action=\"list\" view=\"list\" sectioncount=\"-1\" maxfilesize=\"0\" attributeguid=\"\" searchtext=\"*\" pattern=\"\" startcount=\"1\" orderby=\"name\"/></FOLDER></MEDIA>");
	$.post("/CMS/PlugIns/RemoteAssetManager/rqlaction.asp", { rqlxml: strRQLXML },
		function(data){
			// add stuff to search results
			if($(data).find("FILE").attr("name") != null)
			{
				//add on click alert http://host/saq/images/imagename
				$(data).find('FILE').each(function(){
					$("#searchresult .content").append("<div style=\"border: 2px solid grey; padding: 15px;\"><a onclick=\"alert('URL for Integration Externally: http://host/url/images/" + $(this).attr("name") + "')\" href=\"#img=" + $(this).attr("name") + "\"><br/><img style=\"border:3px solid black;\" src=\"http://win-kk55dom76sa/cms/" + $(this).attr("thumbnailpath") + "\" /><br/>" + $(this).attr("name") + "</a><br/> </div>");
				});
			}
			else
			{
				$("#searchresult .content").append("<div class=\"error\">Folder with guid " + FolderGuid + " not found.</div>");
			}
			$("#searchresult").show();
		}, "xml");
}

function login(){
	// login
	var strLoginRQLXML = "<IODATA> <ADMINISTRATION action=\"login\" name=\"admin\" password=\"dontdothis\"/></IODATA>";
	$.post("/CMS/PlugIns/RemoteAssetManager/rqlaction.asp", { rqlxml: strLoginRQLXML },
		function(data){
			//handle the login request
			loginguid = $(data).find('LOGIN').attr("loginguid");
			userguid = $(data).find('LOGIN').attr("userguid");
			//sessionguid=$(data).find('LOGIN').attr("guid");
			$.cookie("assetlogin", loginguid);
			//alert("loginguid: "+loginguid);
			//"<IODATA loginguid=\"<%= session("loginguid") %>\" sessionkey=\"<%= session("sessionkey") %>\">" + innerRQLXML + "</IODATA>";
			var strProjectRQLXML = padRQLXMLNoSession("<ADMINISTRATION action=\"validate\"><PROJECT guid=\"0B7FE095D7814EE48B95B2E2A41A0BA0\" /></ADMINISTRATION>");
			//alert(strProjectRQLXML);
			$.post("/CMS/PlugIns/RemoteAssetManager/rqlaction.asp", { rqlxml: strProjectRQLXML },
				function(data){
					//handle the project validation request
					sessionguid = $(data).find('SERVER').attr("key");
					$.cookie("assetsession", sessionguid);
					DisplayAssets("FE46C4F25F4847F4A973C6F41EDC42A3");
				}, "xml");
			return "";
		}, "xml");
	return "";
}

function padRQLXML(innerRQLXML)
{
	return "<IODATA loginguid=\"" + loginguid + "\" sessionkey=\"" + sessionguid + "\">" + innerRQLXML + "</IODATA>";
}
function padRQLXMLNoSession(innerRQLXML)
{
	return "<IODATA loginguid=\"" + loginguid + "\">" + innerRQLXML + "</IODATA>";
}

This made a quick and easy UI to include via a JSP similar to the ones used for Delivery Server. This could be expanded on further.

Result:

Both satisfied the technical requirements as they were understood. Some on-the-fly reconfiguration during a break was able to meet the customer’s requirements more accurately.

Time Frame:

Total time <16 hrs

Facts:
This includes the download, install, and configuration of the REST project. A good amount of this time was spent coding the JSP to include the results of my REST services. I will be honest: I’m really a novice with limited hands-on experience, with only some rusty revision of existing RQL scripting as background and an existing AJAX plugin as a guide.

 

WAR Deploy of OT WSM Delivery Server on vFabric tc Server

Here is a quick log of an experiment. vFabric tc Server is a Tomcat clone “Enterprise”-ified by the SpringSource, now VMware, folks. It has hooks set up into their Hyperic analysis tool that I’ve been meaning to look at for a while.

  1. Install tc Server – pretty easy with 60 day demo.
  2. Download and extract DS 10.1 SP1 (pre-patched) from OpenText KC
  3. Download and extract SQL JDBC 1.2
  4. Make sure you have a working SQL 2008 DB up and running, and make sure you create database users. For dev I always use SQL mixed mode, as it’s much quicker to provision new users.
  5. Make sure you have license keys needed for install
  6. Kick off DS setup for 64-bit
  7. Pick Web Archive (.war) deployment
  8. Select region (America/Australia)
  9. Next+Next
  10. Import license XML file (ah so much better than old copy and paste)
  11. Set target path (not the actual tc Server webapps yet)
  12. Next-Next-Next-Next-Next
  13. Set administrative user and pwd
  14. Pick your 1.2 sqljdbc.jar
  15. Enter credentials. If using a named instance with SQL Server, edit the URL: remove the :<%port%> and use host\instance instead.
  16. allow network access (if applicable)
  17. Use default DB names or rename to your heart’s desire
  18. Next Next
  19. Wait to make sure Delivery Server Preparation completes – the last line should read “Initialisation- insert licenses”
  20. Next (if no sample projects) Next Next
  21. Done with WAR creation.
  22. Stop tc Server if running
  23. The next steps parallel, but do not directly follow, the manual for deploying the WAR file
  24. Copy jar files into vfabric-tc-server\myserver\lib
  25. Copy WAR file into vfabric-tc-server\myserver\webapps
  26. Update vfabric-tc-server\myserver\bin\setenv.bat. Here is what I set my last 2 lines to, hybridizing some of the rdesetenv.bat optional settings with the tc Server defaults:
    set JVM_OPTS=-Xmx512M -Xss192K -XX:MaxPermSize=128m
    set JAVA_OPTS=%JVM_OPTS% %AGENT_PATHS% %JAVA_AGENTS% %JAVA_LIBRARY_PATH%  -Duser.language=en -Djava.net.preferIPv4Stack=true -XX:+DisableExplicitGC -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000 -server -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode -XX:+UseParNewGC -XX:+CMSClassUnloadingEnabled -XX:+CMSPermGenSweepingEnabled
  27. Save and Start tc Server
  28. Log in to <host>:<port>/cps/rde/iauth with your user and password.
  29. Do your own verification of functionality because this is as far as my experiment has gone thus far.

Enjoy.