Considerations for Web Builds

The primary question for a web build is one of scalability: at what point will Enterprise-level architecture and traffic capacity become necessary?

A company’s web presence reaches Enterprise level when its site meets at least one of three thresholds of activity. The first is the number of unique visitors it receives. The second is the number of company employees who spend their time entirely on the internet. The third is the number of public and private pages within the site. The bar is set at 1.5 million unique visits per month, 200 employees who work full-time on the internet, or at least 10,000 pages (public and private combined).
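The three thresholds can be expressed as a simple predicate. This is a sketch; the function and parameter names are illustrative, with the cutoff values taken from the paragraph above:

```python
def is_enterprise_level(unique_visits_per_month: int,
                        full_time_web_employees: int,
                        total_pages: int) -> bool:
    """A site is Enterprise level if it crosses any one of the
    three activity thresholds described above."""
    return (unique_visits_per_month >= 1_500_000
            or full_time_web_employees >= 200
            or total_pages >= 10_000)
```

Note that the test is a logical OR: a small team running a very high-traffic site still qualifies.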

Examples of private pages include the login sequence, the e-commerce checkout back end, database access, maintenance pages, and permission-gated levels of information retrieval for generated reports.

Choose a domain name. Most often, this is an aspect of company branding; at the least, one’s web presence should maintain consistency with one’s established brand. Whether one is choosing a domain name for a new brand, product, or service or creating a web presence for an existing one, attention should be given to the Keyword Effectiveness Index (KEI) value of the domain name, as well as to appropriate keywords throughout the site. To maximize Search Engine Optimization (SEO), it is best to choose a domain name with a high KEI value that is lexically associated with the genre of one’s brand, product, or service.

Once a domain name has been chosen, there are 15 separate steps to consider in creating an Enterprise-level web site. While they should be considered in order, work on the steps will overlap.

  1.  Platform of the host: Unix, Windows, Mac, or Solaris
  2.  Dynamic or static web site
  3.  HTML or XHTML
  4.  Site architecture/Database language
  5.  Keyword Effectiveness Index (KEI)
  6.  Content Management System (CMS) and Digital Asset Management (DAM)
  7.  Session identification tags
  8.  Click-track tags, Voice of the Customer (VOC), Behavioral models
  9.  Define “success events”, i.e., converted links
  10.  Template design/User Interface (UI)
  11.  Placement of meta tags, alt and title tags, and content populated with long tail keyword phrases that have high KEI values
  12.  Development of content
    1.  Writing
    2.  Photo
    3.  Video
    4.  Sound
    5.  RSS feeds, Email blast, Downloadable content
    6.  All content must have proper use rights licensing
  13.  World Wide Web Consortium’s (W3C) validation of code
  14.  Web 2.0 and Social Media/Blogging
  15.  Submit to Search Engines
The platform of the host is relevant to security against both hackers and viruses. The platform, however, may limit the languages available for the web site’s architecture.
If a site will be static, meaning that no pages are populated with content from a database, either HTML or XHTML can be used to build its pages. In HTML, style attributes are assigned with Cascading Style Sheets (CSS). If a site is built in XHTML, style attributes can be assigned using either CSS or the XML-based XSL stylesheet languages. Increasingly, dynamic sites, those with page content populated from a database, are built using XHTML.
The primary benefit of XHTML over HTML is that it permits the use of Extensible Markup Language (XML). XML permits links to objects to be scripted in such a manner that individual links aren’t required. For instance, on an e-commerce site, product photos could be placed in a single folder on the server. With XML, it’s possible to write a script that displays the contents of the folder and assigns display attributes (size, border, column width, row height, etc.), whereas in plain HTML each individual photo would need its own hand-coded link.
Regardless of the scripting language chosen for the primary site architecture, administration and intranet pages should utilize AJAX (Asynchronous JavaScript And XML) to load objects that either are repeated on every page or are often reloaded. One such example is the contents of a table on a form, particularly on an admin page where a user refines or repeats a query a number of times. AJAX minimizes page-load times by creating what are, in effect, pages within a page.

The main choices of scripting language for Enterprise-level web sites are PHP, Python, and Ruby on Rails. The choice of a scripting language, more than any other question, defines aspects of scalability: the extent to which a site may later need to be rewritten in a different language and/or moved to a different database.

There are various database systems. The most widely used is MySQL; on Windows servers, Microsoft SQL Server (MS SQL) is the usual counterpart. PHP, Python, and Ruby on Rails can all talk to MySQL. PostgreSQL is an alternative if it is installed on the server.
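As a sketch of how the scripting layer talks to the database (Python here, with the standard-library sqlite3 module standing in for a MySQL server; the same pattern applies through a MySQL driver), note the use of parameterized queries, which keep user input out of the SQL string:

```python
import sqlite3

# An in-memory database stands in for a real MySQL server.
# Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.execute("INSERT INTO products (name, price) VALUES (?, ?)", ("widget", 9.99))
conn.commit()

def product_price(name: str) -> float:
    """Look up a product's price; 0.0 if the product is unknown."""
    # The ? placeholder is a parameterized query, not string pasting.
    row = conn.execute("SELECT price FROM products WHERE name = ?", (name,)).fetchone()
    return row[0] if row else 0.0
```

The placeholder style (`?` here, `%s` in most MySQL drivers) is the part that matters for scalability and security; the rest is boilerplate.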

The Keyword Effectiveness Index (KEI) is the number of times a long tail keyword phrase (a “long tail”) is searched in a 90-day period, squared, then divided by the number of competing mentions. A KEI value above 400 is golden; a value above 4 is acceptable. An Enterprise-level web site should only use long tails with a KEI value above 10.

Long tails are not only for use in the Title, Description, and Keywords meta tags. They should also be used in naming folders and files, as well as in all headings and content on the page.

It bears repeating that choosing a domain name with a high KEI value and a lexical connection to the brand/product/service will help to maximize a web site’s Search Engine Optimization (SEO).
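The KEI definition above translates directly into a formula (a sketch; the function name is illustrative):

```python
def kei(searches_90_days: int, competing_mentions: int) -> float:
    """Keyword Effectiveness Index: searches over a 90-day period,
    squared, divided by the number of competing mentions."""
    if competing_mentions == 0:
        return float("inf")  # no competition at all
    return searches_90_days ** 2 / competing_mentions
```

By the thresholds above, a phrase searched 2,000 times against 5,000 competing mentions scores 2000² / 5000 = 800, comfortably past the “golden” mark of 400.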

A Content Management System (CMS) provides a user interface for managing a site’s content. Content can be prepared and pages built without the hands-on work of a webmaster: text can be styled, formatted, uploaded, and laid out, and photo, video, and sound files can be uploaded and positioned. The most effective CMSs offer drag-and-drop capabilities. All objects are stored in a database.

Levels of access and permissions can be regulated by a CMS. For example, a member of senior management could be given access to database management, while an employee at the clerical level would only be given access, through their login ID, to do data entry.

For an e-commerce site, the CMS provides reports on its customer base. Ideally, a CMS is built to capture and return queries on what an individual client has purchased in a given period, what means of payment was used, the total volume by value or weight shipped to a certain zip code, and so on.

An additional component of a CMS is Digital Asset Management (DAM) software: the tracking and permissions necessary to maintain a library of digital assets such as photographs, video, audio clips, graphics, and PDFs.
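The access-and-permissions idea can be sketched as a simple role table (role and action names here are illustrative, following the senior-management versus clerical example above):

```python
# Hypothetical role table: each role maps to the set of actions
# its login IDs are permitted to perform in the CMS.
ROLE_PERMISSIONS = {
    "senior_management": {"data_entry", "run_reports", "manage_database"},
    "clerical": {"data_entry"},
}

def can(role: str, action: str) -> bool:
    """True if the given role is permitted the given action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

A real CMS resolves the role from the login ID first, then applies exactly this kind of check before each action.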

Session ID tags are primarily used for security purposes. One common result of including such tags in a URL is automatic logout after a period of inactivity. However, since each visit to a web site’s pages, by customers and search engines alike, generates a unique session ID tag, duplication adds up quickly. (See Stoney deGeyter’s article “Why Session IDs And Search Engines Don’t Get Along” for a more in-depth discussion.)

While a CMS can remove session ID tags from a page’s URL, thereby avoiding duplication, inbound links may still cause issues. When the Robots meta tag tells a search engine to “index, follow”, URLs with session IDs will be treated as separate pages, again enabling a search engine to index duplicate content.

A second way to keep session IDs out of the URL is to utilize a mod_rewrite script.
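As a sketch of the mod_rewrite approach (assuming an Apache server and a PHP-style PHPSESSID query parameter; adapt the parameter name, and the leading slash if used in .htaccess context, to your platform), a rule such as the following 301-redirects session-carrying URLs to the clean address so search engines index only one canonical page:

```apache
RewriteEngine On
# If the query string is nothing but a session ID...
RewriteCond %{QUERY_STRING} ^PHPSESSID=[0-9a-z]+$ [NC]
# ...redirect permanently to the same path with the query string stripped.
RewriteRule ^(.*)$ /$1? [R=301,L]
```

The trailing `?` in the substitution is what discards the query string; `R=301` tells search engines the clean URL is the permanent one.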

The purpose of installing click-track tags and Voice of the Customer (VOC) software is to track the click-through patterns of visitors to a site and to build behavioral models with that information. In a perfect world, it’s eight clicks from when a visitor arrives at a site to completing the checkout. VOC software tracks the long tail keyword phrase that produced the click-through from a Search Engine Results Page (SERP) to the text of the page on which a visitor lands. Further, it tracks their click-through pattern on the site. If a visitor leaves after only 10 seconds, the answer they were looking for was either too obscure or not present. VOC software performs a comparison between the long tail searched and a page’s content. From the report the software generates, it’s possible to decide whether to modify the content on the page to maximize the opportunity to convert such a visitor, or whether to disregard that visitor because they don’t sufficiently represent a given target demographic.
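The behavioral logic described, a bounce after 10 seconds and an ideal conversion within roughly eight clicks, can be sketched as a session classifier (thresholds are taken from the paragraph above; labels and names are illustrative):

```python
def classify_session(seconds_on_site: float, clicks: int, converted: bool) -> str:
    """Label a visit for a simple behavioral model."""
    if seconds_on_site <= 10:
        return "bounce"            # answer too obscure or not present
    if converted and clicks <= 8:
        return "ideal conversion"  # the 'perfect world' eight-click path
    if converted:
        return "conversion"        # converted, but via a longer path
    return "browsing"
```

Aggregating these labels per landing page and per long tail is what turns raw click-track data into a behavioral model.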
Although usually associated with e-commerce and the purchase of a product or service, defining what constitutes a converted link is the key to achieving a successful site. Nonprofits may define a successful conversion as a donation, a subscription, or a pledge to take action; service industries may define a converted link as a booked appointment; and so on.

The secret of a long tail keyword phrase is this: if one types a single word, or even two, into a search engine, the Search Engine Results Pages (SERPs) returned are so numerous that the searcher is deemed to be just surfing. However, if one enters four, five, six, or more keywords, the SERPs define a much narrower field. Such a search provides both the basis of a behavioral model and the Voice of the Customer (VOC).
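The distinction drawn above, one or two words meaning surfing while four or more signal intent, can be sketched as a query classifier (cutoffs taken from the paragraph; purely illustrative):

```python
def search_intent(query: str) -> str:
    """Classify a search query by its number of keywords."""
    words = len(query.split())
    if words <= 2:
        return "surfing"    # SERPs too numerous to indicate intent
    if words >= 4:
        return "long tail"  # narrow field: basis of a behavioral model
    return "intermediate"
```

In practice a VOC report applies exactly this kind of bucketing before comparing the phrase against the landing page’s content.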

While there is an endless variety of templates and means of navigation, very few consistently test well for user experience. A liquid layout with mouseover sliding menus for navigation generally tests better. A liquid layout expands and contracts as a visitor drags the corner of their browser window. This feature provides a means to accommodate the growing number of people who surf the internet on small mobile devices, such as cell phones, as well as those who have large flat-screen displays or home theater systems.

There is, of course, speculation on the rate of growth of the mobile market. By March 2007, in China, the Chinese-language version of “American Idol”, titled “Super Girls”, already had more viewers among the teenage demographic watching broadcasts on their smart phones than on TVs.

In addition to building properly populated meta tags, long tails with high Keyword Effectiveness Index (KEI) values need to populate all of a site’s folder and file names, as well as its headings and page content. For dynamic pages, the URL is often created by one of three inputs: the Content Management System (CMS), click-track tags, and session ID tags. CMSs often rename files and folders as well as linked objects. In all instances, files, folders, and objects should be named using long tails with high KEI values.
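When naming files and folders after long tails, a small slug routine keeps the names URL-safe and consistent (a sketch; the hyphen-separated convention is the common search-friendly one):

```python
import re

def long_tail_slug(phrase: str) -> str:
    """Turn a long tail keyword phrase into a folder/file-safe slug."""
    slug = phrase.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")                   # drop any leading/trailing hyphen
```

A CMS that renames files and folders can apply this to the page’s target long tail so the URL itself carries the keywords.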
Content is king. The development of content includes all forms of media that will be placed on the site: the writing, photos, video, and sound files, as well as any Really Simple Syndication (RSS) feeds. All content must be commissioned and/or licensed with appropriate use rights. If any content is offered through feeds or for download, make sure the use rights include such provisions.

The purpose of an RSS feed is to use it much the way one posts bulletins to one’s list of friends on a social media site, or sends e-mail blasts to a list of subscribers. (Always permit the ability to opt out; doing so is required by U.S. law.) The idea in this instance is to provide an RSS feed of announcements, features, specials, press releases, and favorable news coverage to which any visitor to one’s site can subscribe, and, in addition, to purchase a subscription to one of the search engine companion services, which would parse that feed and send it to populate the world’s major search engines. Doing so makes all those announcements, features, specials, etc. searchable by Google, Yahoo!, MSN, etc. in something approaching real time, rather than having to wait for the content to be indexed. (There are 8.1 billion web pages and counting; even Google may take months to index every page of a low-traffic site.)
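A minimal RSS 2.0 feed of such announcements can be generated with the standard library (channel and item values below are placeholders):

```python
import xml.etree.ElementTree as ET

def build_feed(title: str, link: str, items: list[tuple[str, str]]) -> str:
    """Return a minimal RSS 2.0 document; items are (title, link) pairs."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for item_title, item_link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        ET.SubElement(item, "link").text = item_link
    return ET.tostring(rss, encoding="unicode")
```

A production feed would add required channel fields such as description and per-item publication dates; this shows only the skeleton a subscriber or ping service parses.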

The World Wide Web Consortium (W3C) has released drafts for (X)HTML5. For HTML, this is the first update since HTML 4.01 Transitional was introduced in December 1999. For XHTML, it’s the first update since the XHTML 2.0 draft was introduced in August 2006.

Because of the institutional lag between when a new language is introduced and when the browser manufacturers release updates, no browsers are as yet available that can read all the attributes available in either XHTML 2.0 or (X)HTML5. For example, Microsoft took almost five years to update Internet Explorer from 6.0 to 7.0. Consequently, as web designers/developers, we’re all still writing in HTML 4.01 Transitional, XHTML 1.0 Transitional, and XHTML 1.1 Strict.

HTML and XHTML will be developed in unison. Style sheets for HTML must be written in Cascading Style Sheet (CSS) format, while style sheets for XHTML may be written in either CSS or the XML-based XSL languages. All style attributes must now live in style sheets; it is no longer permissible to place style attributes directly in table, tr, td, tbody, div, span, dl, dt, dd, ul, ol, li, and other elements.

On a day-to-day level, one of the areas where the impact will be felt is that, while (X)HTML5, the language, will be backward-compatible, new browsers that are (X)HTML5-compliant will be less forgiving of poor code. The purpose of raising the standard is to minimize issues in the future, when the majority of applications, eventually even elements of our operating systems, will live online.

Below are the addresses for the W3C’s validation tools. The primary purpose of running one’s pages through a validation tool is to correct errors in the code, in short, to be compliant with W3C guidelines. Compliant pages are more likely to “play well” across different browsers.

For example, in the first half of this year, my pages were visited by 41 different browsers or browser versions. Of these 41, 14 have greater than 1% market share; ten have greater than 2%. Regardless of whether a page is static or dynamic (generated by a content management system with content populated from a database), developers should at least strive to write well enough not to cause issues for those ten, if not for all 14 browser versions with at least 1% market share.

To use any of the tools, paste the web address of the page into the tool’s address window and click the “Check”, “Validate”, or “Revalidate” button. For the HTML and XHTML validator, check both the “Group Error Messages by type” and “Verbose Output” options and enter a full URL address.

W3C HTML 4.01 and XHTML 1.0 Validation Tool

W3C CSS Validation Tool

W3C (X)HTML5 Validation Tool
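Validation can also be scripted. As a sketch, this builds the request URL for the W3C’s Nu (HTML5-era) validator; the validator.w3.org/nu endpoint and its doc/out parameters are its documented public interface, but verify current details before relying on them:

```python
from urllib.parse import urlencode

def validator_url(page_url: str) -> str:
    """URL asking the W3C Nu validator to check page_url, JSON output."""
    query = urlencode({"doc": page_url, "out": "json"})
    return "https://validator.w3.org/nu/?" + query
```

Fetching that URL (with urllib or any HTTP client) returns the validator’s error report as JSON, which a build script can fail on.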

In Web 2.0, more traffic is driven from profiles on the major social media sites than from search engines. There are seven English-language social media sites with more than 15 million members. The only drawback is the volume of time required to build and maintain these profiles. Below is a link for one of the tools recommended by Technorati: Pingomatic. It’s designed for bloggers: when you update your blog, the tool will update the search engines you’ve requested (there are 22 available). Please note: these search engines are designed specifically for blogs and social media sites.

The largest social media sites (all with membership above 15 million) are:

1. MySpace
2. Facebook
3. Bebo
4. Xanga
5. hi5
6. Cyworld
7. Mixi (Japanese only)

Of course, YouTube and Flickr should be mentioned for their impact on the distribution of visual media, and Twitter for its impact on Short Message Service (SMS) for mobile devices.

The idea is to create a profile on each of the largest social media sites, keep a blog on each profile, and then use Pingomatic and similar Really Simple Syndication (RSS)-based tools to populate the relevant search engines. All of these carry reciprocal links back to one’s primary web site.
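Pingomatic-style blog pings use the conventional weblogUpdates.ping XML-RPC call, whose request body the standard library can build. This sketch stops short of sending the request (the usual endpoint, rpc.pingomatic.com, and the method name are the conventional ones; verify before use):

```python
import xmlrpc.client

def ping_payload(blog_name: str, blog_url: str) -> str:
    """XML-RPC request body for a weblogUpdates.ping call."""
    # dumps() serializes the parameter tuple and method name to XML.
    return xmlrpc.client.dumps((blog_name, blog_url), "weblogUpdates.ping")
```

Sending that body as an HTTP POST to the ping service (or simply using `xmlrpc.client.ServerProxy`) notifies the blog search engines that the feed has new content.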

In theory, using Web 2.0 social media will drive more traffic than Google. Though when one weighs the time spent keeping up blogs and profiles, nothing beats good old organic Search Engine Optimization (SEO).

Google is, by the way, in beta development of a search engine that will provide indexing of social media pages. The project is entitled “The Mechanical Zoo”.

Submit one’s site to the major search engines. This can be accomplished on each individual search engine’s site or through services that will submit one’s site to a multitude of search engines at once.

One’s social media profiles, blogs, and Really Simple Syndication (RSS) feeds can be submitted through tools such as those provided by Pingomatic and A9.