Thursday, January 29, 2009

SQLStream Simplifies Event Stream Processing

I spoke earlier this week with SQLStream, which offers software to execute queries against data streams such as stock market prices, Web logs and credit card transactions. These queries can include on-the-fly calculations such as moving averages, as well as scans for patterns like a sequence of failed log-in attempts. Typical applications include security monitoring, fraud detection, and general business activity monitoring. Marketers can use the queries to identify new leads and select cross-sell and upsell offers. Although the connection is a little less obvious, the system can also be used as an alternative to conventional batch data preparation methods for tasks like customer data integration.

SQLStream’s particular claim to fame is that its queries are almost identical to garden-variety SQL. Other vendors in this space apparently use more proprietary approaches. I say “apparently” because I haven’t researched the competition in any depth. A quick bit of poking around was enough to scare me off: there are many vendors in the space and it is a highly technical topic. It turns out that stream processing is one type of “complex event processing,” a field which has attracted some very smart but contentious experts. To see what I mean, check out Event Processing Thinking (Opher Etzion) and Cyberstrategics Complex Event Processing Blog (Tim Bass). This is clearly not a group to mess with.

That said, SQLStream’s more or less direct competitors seem to include Coral8, Truviso, Progress Apama, Oracle BAM, TIBCO BusinessEvents, KX Systems, StreamBase and Aleri. For a basic introduction to data stream processing, see this presentation from Truviso.

Back to SQLStream. As I said, it lets users write what are essentially standard SQL queries that are directed against a data stream rather than a static table. The data stream can be any JDBC-accessible data source, which includes most types of databases and file structures. The system can also accept streams of XML data over HTTP, which includes RSS feeds, Twitter posts and other Web sources. Its queries can also incorporate conventional (non-streaming) relational database tables, which is very useful when you need to compare streamed inputs against more or less static reference information. For example, you might want to check current activity against a customer’s six-month average bank balance or transaction rate.

The advantages of using SQL queries are that there are lots of SQL programmers out there and that SQL is relatively easy to write and understand. The disadvantage (in my opinion; not surprisingly, SQLStream didn’t mention this) is that SQL is really bad at certain kinds of queries, such as queries comparing subsets within the query universe and queries based on record sequence. Lack of sequencing may sound like a pretty big drawback for a stream processing system, but SQLStream compensates by letting queries specify a time “window” of records to analyze. This makes queries such as “more than three transactions in the past minute” quite simple. (The notion of “windows” is common among stream processing systems.) To handle subsets within queries, SQLStream mimics a common SQL technique of converting one complex query into a sequence of simple queries. In SQLStream terms, this means the output of one query can be a stream that is read by another query. These streams can be cascaded indefinitely in what SQLStream calls a “data flow architecture”. Queries can also call external services, such as address verification, and incorporate the results. Query results can be posted as records to a regular database table.
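
To make the “window” idea concrete, here is a minimal sketch in Python of the “more than three transactions in the past minute” pattern. This is my own illustration of the underlying mechanics, not SQLStream syntax (which, per the vendor, would express the same thing declaratively in SQL); all names and thresholds are hypothetical.

```python
from collections import deque

WINDOW_SECONDS = 60   # the time "window" of records to analyze
THRESHOLD = 3

recent = {}  # account_id -> deque of recent transaction timestamps

def on_transaction(account_id, timestamp):
    """Process one event from the stream; return True when the pattern fires."""
    window = recent.setdefault(account_id, deque())
    window.append(timestamp)
    # Drop events that have aged out of the one-minute window.
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > THRESHOLD

# The fourth transaction within a minute trips the alert.
for t in (0, 10, 20, 30):
    fired = on_transaction("acct-42", t)
print(fired)  # True
```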

SQLStream does its actual processing by holding all the necessary data in memory. It automatically examines all active queries to determine how long data must be retained: thus, if three different queries need a data element for one, two and three minutes, the system will keep that data in memory for three minutes. SQLStream can run on 64-bit servers, allowing effectively unlimited memory, at least in theory. In practice, it is bound by the physical memory available: if the stream feeds more data than the server can hold, some data will be lost. The vendor is working on strategies to solve this problem, probably by retaining the overflow data and processing it later. For now, the company simply recommends that users make sure they have plenty of extra memory available.
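
The retention rule itself is simple enough to state in a line of code. A toy illustration (the query names and window lengths are made up):

```python
# Each active query declares how long it needs data, in seconds; the
# engine retains each data element for the longest such window.
active_query_windows = {"fraud_check": 60, "velocity_check": 120, "balance_check": 180}

retention_seconds = max(active_query_windows.values())
print(retention_seconds)  # 180, i.e. three minutes, matching the example above
```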

In addition to memory, system throughput depends on processing power. SQLStream currently runs on multi-core, single-server systems and is moving towards multi-node parallel processing. Existing systems process tens of thousands of records per second. By itself, this isn't a terribly meaningful figure, since capacity also depends on record size, query complexity, and data retention windows. In any case, the vendor is aiming to support one million records per second.

SQLStream was founded in 2002 and owns some basic stream processing patents. The product itself was launched only in 2008 and currently has about a dozen customers. Since the company is still seeking to establish itself, pricing is, in their words, “very aggressive”.

If you’re still reading this, you probably have a pretty specific reason for being interested in SQLStream or stream processing in general. But just in case you’re wondering “Why the heck is he writing about this in a marketing blog?” there are actually several reasons. The most obvious is that “real time analytics” and “real time interaction management” are increasingly prominent topics among marketers. Real time analytics provides insights into customer behaviors at either a group level (e.g., trends in keyword response) or for an individual (e.g., estimated lifetime value). Real time interaction management goes beyond insight to recommend individual treatments as the interaction takes place (e.g., which offer to make during a phone call). Both require the type of quick reaction to new data that stream processing can provide.

There is also increasing interest in behavior detection, sometimes called event driven marketing. This monitors customer behaviors for opportunities to initiate an interaction. The concept is not widely adopted, even though it has proven successful again and again. (For example, Mark Holtom of Eventricity recently shared some very solid research that found event-based contacts were twice as productive as any other type. Unfortunately the details are confidential, but if you contact Mark via Eventricity perhaps he can elaborate.) I don’t think lack of stream processing technology is the real obstacle to event-based marketing, but perhaps greater awareness of stream processing would stir up interest in behavior detection in general.

Finally, stream processing is important because so much attention has recently been focused on analytical databases that use special storage techniques such as columnar or in-memory structures. These require processing to put the data into the proper format. Some offer incremental updates, but in general the updates run as batch processes and the systems are not tuned for real-time or near-real-time reactions. So it’s worth considering stream processing systems as a complement that lets companies employ these other technologies without giving up quick response to new data.

I suppose there's one more reason: I think this stuff is really neat. Am I allowed to say that?

Tuesday, January 20, 2009

Salespeople: One Question Matters Most

Back in December, the Sales Lead Management Association and LEADTRACK published a survey on lead management practices that I haven’t previously had time to write about. (The survey is still available on the SLMA Web site.) It contained 10 questions, which is about as many as I can easily grasp.

The two clearest answers came from questions about the information salespeople want and why they don’t follow up on inquiries. By far the most desired piece of information about a lead was purchasing time frame: this was cited by 41% of respondents, compared with budget (17%), application (15%), lead score (15%) and authority (12%). I guess it’s a safe bet that salespeople jump quickly on leads who are about to purchase and pretty much ignore the others, so this finding strongly reinforces the need for nurturing campaigns that allow marketers to keep in contact with leads who are not yet ready to buy.

Note that none of the listed categories included behavioral information such as email clickthroughs or Web page visits, which demand generation vendors make so much of. I doubt they would have ranked highly had they been included. Although behavioral data provides some insights into a lead’s state of mind, it's useful to be reminded that wholly pragmatic facts about time frame are a salesperson's paramount concern.

The other clear message from the survey was that the main reason leads are not followed up is “not enough info”. This was cited by 55% of respondents, compared with 14% for “inquired before, never bought”, 12% for “no system to organize leads”, 10% for “no phone number”, 7% for "geo undesirable" and 2% because of "no quota on product". This is an unsurprising result, since (a) good information is often missing and (b) salespeople don’t like to waste time on unqualified leads. Based on the previous question, we can probably assume that the critical piece of necessary information is time frame. So this answer reinforces the importance of gathering that information and passing it on.

One set of answers that surprised me a bit was that 77% or 80% of salespeople were working with an automated lead management system, either “CRM/lead management” or “Software as a Service”. I’ve given two figures because the question was purposely asked two different ways to check for consistency. The categories don’t make much sense to me because they overlap: products like Salesforce.com are both CRM systems and SaaS. Still, this doesn't affect the main finding that nearly everyone has some type of automated system to “update lead status” and “manage your inquiries” (the two different questions that were asked). This is higher market penetration than I expected, although I do recognize that those questions deal more with lead management (a traditional sales automation function) than lead generation (the province of demand generation systems). Still, to the extent that CRM systems can offer demand generation functions, there may be a more limited market for demand generation than the vendors expect.

One final interesting set of figures had to do with marketing measurement. The survey found that 23% of companies measure ROI for all lead generation tactics, 30% measure it for some tactics, and 47% don’t measure it at all. The authors of the survey report seem to find these numbers distressingly low, particularly in comparison with the 80% of companies that have a system in place and, at least in theory, are capturing the data needed for measurement. I suppose I come at this from a different perspective, having seen so many surveys over the years showing that most companies don’t do much measurement. To me, 23% measuring everything seems unbelievably high. (For example, Jim Lenskold's 2008 Marketing ROI and Measurements Study found 26% of respondents measured ROI on some or all campaigns; the combination of "some" and "all" in the SLMA study is 53%.) Either way, of course, there is plenty of room for improvement, and that's what really counts.

Monday, January 19, 2009

New Best Practices White Paper

As promised, I've written a white paper with the 37 Marketing Automation Best Practices listed in last week's post. This is in the Resources section of the Raab Guide to Demand Generation Systems site; you have to register and then log in. Registration is free.

Saturday, January 17, 2009

Best Practices for Marketing Automation and Demand Generation Campaigns

I enjoyed my little presentation on BrightTalk last Wednesday, which you can still view by clicking here. (If that doesn’t work, go to the BrightTalk site and key my name into the site search function. This will also bring up a roundtable discussion from Tuesday, which I think was interesting as well.) The BrightTalk platform itself worked nicely and was about as simple as possible. They offer a limited version for free (one 30 minute Webinar per month), which is worth considering if you’d like to dip your toe into this sort of thing. The next level up is $949 per month, which is rather pricey compared with $99 per month for Go To Webinar, a platform we’ve used here which does roughly the same thing. I'm not saying they're identical: BrightTalk lets you upload your slides rather than sharing the screen of your PC, which makes it more reliable, and seems to offer some promotional services too. So you’d want to look more closely at both paid services before making a choice.

But I digress. The heart of my presentation on Wednesday ended up as a list of 37 “best practices” for marketing automation / demand generation programs. I’ll probably embed them in a white paper for the Raab Guide Web site in the near future, but for now I thought I’d share them here. (If you want the full slide deck, complete with moderately witty speaker notes, drop me an email at draab@raabassociates.com.)

A bit of context: the presentation listed a sequence of steps for marketing campaign creation, deployment and analysis. The best practices are organized around those steps.

Step 1: Gather Data. The marketer assembles information about the target audience. Best practices here involve the types of data, and, in particular, expanding beyond traditional sources.
• leads, promotions, responses, orders: these are the traditional data sources used in most marketing systems. Best practice is to link actual orders back to the individual leads and to perform accurate customer data integration.

• external demographics, preferences, contact names: the best practice here is to supplement internal data with external sources such as D&B, Hoovers, LexisNexis, ZoomInfo, etc. More information allows more accurate targeting and better lead scoring.

• social networks: these can be another source of contact names, and sometimes of introductions via mutual friends. A close look at what individuals have said and done in these networks could provide deep insight into a particular person’s needs, attitudes and interests, but this is more an activity for salespeople than marketers.

• summarized activity detail: marketing systems gather an overwhelming mass of detail about prospect activity, down to every click on every Web page. Best practice is to make this more usable by flagging summaries such as “three visits in the past seven days” and making them available for segmentation and event-triggered marketing. (A sketch of this kind of flagging appears after this list.)

• self-adjusting surveys: once a lead has answered a survey question, the system should automatically replace that question with a new one. This builds a richer customer profile and avoids annoying the lead by asking the same question twice. For a bonus best practice, the system should choose the next question based on user-defined rules that select the most useful question for each individual.

• order detail, payments, customer service: the best practice is to gather information beyond the basic order history from operational systems. This also allows more precise targeting and may uncover opportunities that are otherwise invisible, such as follow-up to service problems.

• near real-time updates: fast access to information about lead behaviors allows quick response, particularly in the form of event-triggered messages. This can be critical to engaging a prospect when her interest is at its peak, and before she turns to a competitor.

• household and company levels: consumers should be grouped by households and business leads by their company, division or project. This grouping permits selections and scoring based on activity of the entire group, which may display patterns that are not visible when looking at just a single individual.
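
As promised above, here is a minimal sketch of the “summarized activity detail” item: rolling click-level detail up into a flag like “three or more visits in the past seven days”. All field names and thresholds are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical raw activity detail: one row per Web visit, keyed by lead.
visits = [
    {"lead_id": 7, "visited_at": datetime(2009, 1, 12)},
    {"lead_id": 7, "visited_at": datetime(2009, 1, 14)},
    {"lead_id": 7, "visited_at": datetime(2009, 1, 16)},
    {"lead_id": 9, "visited_at": datetime(2009, 1, 2)},
]

def flag_active_leads(visits, as_of, days=7, min_visits=3):
    """Summarize click-level detail into a per-lead flag usable for segmentation."""
    cutoff = as_of - timedelta(days=days)
    counts = {}
    for v in visits:
        if v["visited_at"] >= cutoff:
            counts[v["lead_id"]] = counts.get(v["lead_id"], 0) + 1
    return {lead: n >= min_visits for lead, n in counts.items()}

print(flag_active_leads(visits, as_of=datetime(2009, 1, 17)))  # {7: True}
```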

Step 2: Design Campaign. The marketer now designs the flow of the campaign itself. Traditional marketing programs use a small number of simple campaigns, each designed from scratch and often used just once. Even traditional campaigns often include multiple steps, so this itself isn’t listed separately as a new best practice.

• many specialized campaigns: the best practice marketer deploys many campaigns, each tailored to a specific customer segment or business need. These are more effective because they are more tightly targeted.

• cross sell, up sell and retention campaigns: demand generation focuses primarily on campaigns to acquire new leads. The best practice is to supplement these with campaigns that help sell more to existing customers and to retain those customers. Marketing automation has generally included these types of campaigns, at least in theory, but many firms could productively expand their efforts in these areas.

• share and reuse components (structure, rules, lists): when marketers are running many specialized campaigns, they have greater opportunity to share common components, and greater benefit from doing so. Sharing makes it possible to build more complex, sophisticated components and to ensure consistency both in how each customer is treated and in how company policies are implemented.

• new channels (search, Web ads, mobile, social): these new channels are often more efficient than traditional channels, and many have other benefits such as being easier to measure. Best practice marketers test new channels aggressively to find out what works and how they can best be used. Even if the new channels are not immediately cost-effective, marketers can limit their investment but still build some experience that will be useful later.

• multiple channels in same campaign: true multichannel campaigns contact customers through different media. A mix of media allows you to reach customers who are responsive in different channels, thereby boosting the aggregate response. Channels may also be chosen based on stated customer preferences and the nature of a particular contact. Marketing automation systems make it easy to switch between media within a single campaign.

Step 3: Develop Content. This step creates the actual marketing materials needed by each step in the campaign design. These are emails, call scripts, landing pages, brochures, and so on.

• rule-selected content blocks and best offers: content is tailored to individuals not simply by inserting data elements (“Dear [First_Name]”) but by executing rules that select different messages based on the situation. For example, a rule might send different messages based on the customer’s account balance; a sketch appears after this list.

• map drip-marketing message to buyer stage: best practice nurturing campaigns deliver messages that move the lead through a sequence of stages, typically starting with general information and becoming more product oriented. This is more effective than sending the same message to everyone or always sending product information.

• standard templates: messages are built using standard templates that share a desired look-and-feel and contain common elements such as headers and footers. This provides consistency, saves work, and ensures that policies are followed.

• share and reuse components (items, content blocks, images): like shared campaign components, shared marketing contents minimize the work needed to create many different, tailored campaigns. Sharing also makes it easy to deploy changes, such as a new price or new logo, without individually modifying every item in every campaign.

• unified content management system across channels: even though most marketing materials are channel-specific, many components such as images and text blocks can in fact be shared across different channels. Managing these through a single system further saves work, supports sharing, and ensures consistency.
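
Here is the sketch promised in the first item of this list: rule-selected content, where each rule pairs a test with a message and the first matching rule wins. The account-balance thresholds and messages are invented for illustration.

```python
# Ordered rules: most specific first, with a catch-all default at the end.
content_rules = [
    (lambda c: c["balance"] < 0,     "We can help you get back on track."),
    (lambda c: c["balance"] > 50000, "Ask us about our premium accounts."),
    (lambda c: True,                 "Thanks for being a customer."),
]

def select_content(customer):
    """Return the first message whose rule matches this customer."""
    for test, message in content_rules:
        if test(customer):
            return message

print(select_content({"balance": 72000}))  # the premium-account message
```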

Step 4: Execute Campaign. The campaign is deployed to actual customers. Best practice campaigns often run continuously, rather than being executed once and then replaced with something new. This lets marketers refine them over time, testing different treatments for different conditions and keeping the winners.

• separate treatments by segment: messages and campaign flows are tailored to the needs of each segment. This could be done by creating one campaign with variations for different segments or by creating separate campaigns for each segment. Which works best depends largely on your particular marketing automation system. Either way, shared components should keep the redundant work to a minimum.

• statistical modeling for segmentation: predictive model scores can often define segments more accurately than manual segmentations. Perhaps more important, they can be less labor-intensive to create, allowing marketers to build more segments and rebuild them more often. This matters because best practice marketing involves so many specialized campaigns and is constantly adjusting to new conditions.

• change campaign flow based on responses, events, activities: best practice campaigns change lead treatments in response to their behaviors. Thus, instead of a fixed sequence of treatments, they send leads down different branches and, in some cases, move them from one campaign to another. Changes may be triggered by activities within the campaign, such as response to a message or data provided within a form, or by information recorded elsewhere and reported to the marketing automation system.

• advanced scoring (complex rules, activity patterns, event depreciation, point caps): simple lead scoring formulas are often inaccurate predictors of future behavior. Best practice scoring may involve complex calculations based on relationships among several data elements, summarized activity detail, reduced value assigned to less recent events, and caps on the number of points assigned for any single type of activity (see the sketch after this list). A related challenge for system designers is making complex formulas reasonably easy to set up and understand.

• company-level scores and activity tracking: the best practice campaign can use aggregated company or household data to calculate scores, guide individual treatments, and issue alerts. This allows more appropriate treatment than looking at each individual in isolation.

• multiple scores per lead: for companies with several products, the best practice is to calculate a separate lead score for each. The scores may also have different thresholds for sending the lead to sales.

• define score formula jointly with sales: the salesperson is the ultimate judge of whether a lead is qualified. But many marketing departments still set up lead scoring formulas without sales input. Best practice is to work together on defining the criteria and then to periodically review the results to see if the formula can be improved.

• let sales return leads for more nurturing: traditional lead management is a one-way street, with leads sent from marketing to sales and then never heard from again. Best practice marketers allow salespeople to return leads to marketing for further nurturing. This improves the chances of a lead ultimately making a purchase, even if it doesn’t happen right away.
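
As noted in the advanced-scoring item above, here is a minimal sketch of a formula that combines event depreciation with per-activity point caps. The point values, half-life and caps are all hypothetical.

```python
from datetime import datetime

POINTS = {"web_visit": 5, "email_click": 10, "form_fill": 25}
CAPS = {"web_visit": 20, "email_click": 30, "form_fill": 50}
HALF_LIFE_DAYS = 30  # an event loses half its value every 30 days

def score_lead(activities, as_of):
    """Sum depreciated points per activity type, then apply the caps."""
    totals = {}
    for a in activities:
        age_days = (as_of - a["when"]).days
        value = POINTS[a["type"]] * 0.5 ** (age_days / HALF_LIFE_DAYS)
        totals[a["type"]] = totals.get(a["type"], 0) + value
    return sum(min(points, CAPS[activity]) for activity, points in totals.items())

activities = [
    {"type": "web_visit", "when": datetime(2009, 1, 15)},
    {"type": "form_fill", "when": datetime(2008, 12, 1)},
]
print(round(score_lead(activities, as_of=datetime(2009, 1, 17)), 1))  # about 13.2
```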

Step 5: Analyze Results. Learning from past campaigns may be the most important best practice of all. Having many targeted campaigns allows for continuous incremental improvement, achieved by quickly evaluating the return on each project and adjusting future programs based on the results.

• advanced response attribution: traditional methods often credit a lead to whichever campaign contacted them first, or whichever generated the first response. Best practice marketers look more deeply at the factors which may have influenced a lead’s behavior, often applying sophisticated analytics to estimate the incremental impact of different campaigns.

• standard metrics, within and across channels: resources can only be allocated to their optimal use if return on investment can be compared across campaigns. This requires standard metrics, which must be calculated consistently and clearly understood throughout the organization.

• formal test designs (a/b, multivariate): traditional marketers often do little testing, and the tests they do are often poorly designed. Best practice marketing involves continuous, formal testing designed to answer specific questions and lead to actionable results (see the sketch after this list).

• capture immediate and long-term results: initial response rate or cost per lead fails to take into account the value of the leads generated, which can differ hugely from campaign to campaign. Best practice requires measuring the long-term value and building it into standard campaign metrics.

• evaluate on customer profitability, not revenue: customers with the same revenue can vary greatly in the actual profit they bring to the company, depending on the profit margins of their purchases and other costs such as customer support. Best practice metrics include accurate profitability measures, preferably drawn from an activity-based costing system.

• continually assess and reallocate spending: best practice marketers have a formal process to shift resources to the most productive marketing investments. These will change as campaigns are refined, business conditions evolve, and new opportunities emerge. A formal assessment process is essential because organizations otherwise tend to resist change.
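
For the formal-testing item above, here is a minimal sketch of the arithmetic behind a simple a/b comparison: a standard two-proportion z-test on response counts. The sample figures are made up.

```python
from math import erf, sqrt

def ab_test(resp_a, n_a, resp_b, n_b):
    """Return z statistic and two-sided p-value for a difference in response rates."""
    rate_a, rate_b = resp_a / n_a, resp_b / n_b
    pooled = (resp_a + resp_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_a - rate_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

z, p = ab_test(resp_a=120, n_a=5000, resp_b=160, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below .05 suggests the difference is not chance
```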

Infrastructure. Individual campaigns are made possible by an underlying infrastructure that has best practices of its own.

• consolidated systems (multi-channel content management, campaign management and analytics): today’s marketing systems can usually handle multiple channels, so decommissioning older channel-based systems may save money as well as making multi-channel campaigns easier to execute. Consolidated multi-channel analytics, which may occur outside of the marketing automation system, are particularly important for gaining a complete view of each customer.

• advanced system training: marketing departments often provide workers with the minimum training needed to gain competency in their tools. Best practice departments recognize that additional training can make users more productive, particularly as the tools themselves add new capabilities that users would otherwise not be able to exploit.

• advanced analytics training: analytics play a central role in the continuous improvement process. Solid analytics training ensures that users can set up proper tests and interpret the results. Because data and tools are often already available, lack of training is frequently the main obstacle that prevents marketers from using analytics effectively.

• formal processes: best practice marketers develop, document and enforce formal, consistent business processes. This both ensures that work is done efficiently and makes it possible to execute changes when opportunities arise.

• cross-department cooperation: working with sales, service, finance and other departments is essential to sharing systems, data and metrics. A cross-department perspective ensures that each department considers the impact of its decisions on the rest of the company and on the customers themselves.

Summary

The best practice vision is many marketing campaigns, each precisely targeted, efficiently executed, and carefully designed to yield the greatest possible value. The campaigns are supported by detailed analysis to understand results and identify potential improvements. This information is quickly fed into new campaigns, ensuring that the company continually evolves its approaches and makes the best possible use of marketing resources. Continuous optimization is the ultimate best practice that all other practices should support.

Thursday, January 08, 2009

Company-Level Data in Demand Generation Systems

I had an interesting email conversation last month with a Raab Guide buyer about the nuances of company-level data management in demand generation systems. He started from the perfectly reasonable premise that the demand generation system should give an overview of activity for all leads associated with a given company. This is a topic we do cover in the Guide, but not in the detail he needed. The conversation has sharpened my own thinking on the subject and prompted a couple of conversations with vendors. I think it’s worth discussing here in some detail.

There are really two separate issues at play. The first is how the demand generation system treats company-level data. This, to start at the very beginning, is data about the company associated with an individual. Typically this is the company they work for, although there might be another relationship such as consultant or services vendor. From a database design standpoint, having one company record that is linked to multiple individuals avoids redundant data and, therefore, potential inconsistencies between company data for different people. (The technical term for this is “normalization”, meaning that the database is designed so a given piece of information is stored only once.) Some of the demand generation systems do indeed have a separate company table: it is one of the marks of a sophisticated design.
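
To picture the normalization point, here is a toy example: the company facts live on a single record, and each individual carries only a link to it, so an update made once is seen for everyone. All names and fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Company:
    company_id: int
    name: str
    industry: str

@dataclass
class Individual:
    individual_id: int
    name: str
    company_id: int  # foreign key into the single company record

companies = {1: Company(1, "Acme Corp", "Manufacturing")}
people = [Individual(10, "Jane Doe", 1), Individual(11, "John Roe", 1)]

companies[1].industry = "Aerospace"   # one update in one place...
for p in people:                      # ...and every linked individual reflects it
    print(p.name, companies[p.company_id].industry)
```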

But the wrinkle here is that most CRM systems in general, and Salesforce.com in particular, also have separate company and individual levels. In fact, Salesforce.com actually has two types of individuals: “leads”, which are unattached individuals, and “contacts”, which are individuals associated with an “account” (typically a company, although it might be a smaller entity such as a division or department). Most demand generation systems make little distinction between CRM “leads” and “contacts”, converting them both to the same record type when data is loaded or synchronized.

The common assumption among demand generation vendors is that the CRM system is the primary source of information for which individuals are associated with which companies and for the company information itself. This makes sense, given that the salespeople who manage the CRM data are much closer to the companies and individuals than the marketers who run the demand generation system. Demand generation systems therefore generally import the company / individual relationships as part of their synchronization process. Systems with a separate company table store the imported company data in it; those without a separate company table copy the (same) company data onto each individual record. So far so good.

However, this raises the question of whether the demand generation system should be permitted to override company / individual relationships defined in the CRM system or to edit the company (or, for that matter, individual) data itself. I have to get back to the vendors and ask the question, but I believe that most vendors do NOT let the demand generation system assign individuals to companies or change the imported relationships. (In at least some cases, users can choose whether or not to allow such changes.) Interestingly enough, these limits apply even to systems that infer a visitor’s company from their IP address and show it in reports. Whether demand generation can change company-level data and have those changes flow back into the CRM system is less clear: again, it may be a matter of configuration in some products. The vendors who don’t provide this capability will argue, probably correctly, that few marketers really want to do this or, in fact, should.

So what information DOES originate in the demand generation system? Basically, it is new individuals, which correspond to “lead” records in Salesforce.com, and attributes for existing individuals, which may relate to either CRM “leads” or “contacts”. The demand generation system may also capture company information, but this is stored on the individual record and kept distinct from the company information in the CRM system. When a new individual is sent from demand generation to the CRM system, it is set up in CRM as a “lead” (that is, unattached to a company). The CRM user can later convert it to a contact within an account. But the demand generation system cannot generally set up accounts and contacts itself. (Again, let me stress that there may be some exceptions—I’ll let you know when I find out.)

Bottom line: leads, lead data and contact data may originate in either demand generation or CRM, and the synchronization is truly bi-directional in that changes made in either system will be copied to the other. But accounts and account / contact relationships are only maintained in the CRM system: most demand generation systems simply copy that data and don’t allow changes. Thus, it is essentially a unidirectional synchronization.

This leads us to the second issue, which is company-level reporting. It’s touted by most demand generation vendors, but a closer look reveals that some actually rely on the account-level reporting in Salesforce.com to deliver it. This isn’t necessarily a problem, since the Salesforce.com report can incorporate activities captured by the demand generation system. Per the earlier discussion, these activities will be linked to individuals (leads or contacts in Salesforce.com terms). Of course, if the individuals have not been linked to a company in the CRM system, they cannot be included in company-level reports.

One problem with relying on CRM for company-level reporting is that the demand generation system may have some nice reporting capabilities that you’d want to use. So there is some advantage to capturing the company / individual links within the demand generation system, even if the links themselves are created in CRM. (Not to get too technical here, but those “links” could simply be a company ID on the individual record, even if there is no separate company-level table in the data model. So lacking a company table does not preclude company-level reporting.)

A second issue is that some marketers want to use company-level information in their lead scoring calculations. This is another question we ask explicitly in the Guide, and only some companies say they can do it. Whether they make it simple is another question—some products that claim this capability in fact require considerable effort to make it happen. Again, the vendors who don’t offer this would presumably say that it isn’t very important to their clients.

I hope this clarifies both the mechanics of handling company-level data and some of the underlying business issues. It’s one of those aspects of demand generation systems where vendor differences are real but subtle, and where the true importance of those differences is clear only to experienced users. All I can do in the Guide and this blog is try to describe things as clearly and accurately as possible, and thereby help you to make sound judgments about your own business needs.

Tuesday, January 06, 2009

Raab Marketing Automation Webinar on January 14

Just a brief note to let you know that I'll be giving a Webinar on "How to Get the Most Value from Your Marketing Automation System" on Wednesday, January 14 at 4 p.m. Eastern / 1 p.m. Pacific. You can register at http://www.brighttalk.com/webcasts/2017/attend. This is part of a two-day Marketing Automation Summit organized by BrightTALK, a vendor of webcasting technology that sponsors channels of specialized content, as well as running regular online conferences. As near as I can tell, BrightTALK makes its money selling the technology and related services: according to their Web site, you can have your own "channel" with unlimited webcasts for $949 per month. Whether their conferences are simply a way to promote this or have another revenue stream attached, I don't know.

In any case, my presentation on the 14th will look at short-term ways of getting value from existing systems, on the theory that money for new systems will be scarce in the near future. This is a bit of a switch from my usual focus on system selection. But system selection always boils down to how the system will be used, so it isn't much of a stretch.

Tune in if you have a chance, and if you're, say, recovering from a skiing accident over the holidays, you can listen to the full lineup of sessions starting on January 13. I'll also be participating in a round table discussion at 7 p.m. Eastern / 4 p.m. Pacific on the 13th.