Microsoft SharePoint and .NET Technology Insight

SharePoint, .NET, Office365, Windows Azure, TFS, Project Server and SQL Server Technology Insights



SharePoint 2010 and Host Named Site Collections

A host header is a third piece of information that you can use, in addition to the IP address and port number, to uniquely identify a website (or, in SharePoint terms, a web application). In SharePoint 2010, host headers can be applied at two different levels:

The Web application (IIS Web site) level
The site collection level

A host-named site collection allows you to address a site collection with a unique DNS name. When a SharePoint web application is created, it typically contains many path-based site collections that share the same host name (DNS name). For example, Team A has a site collection at http://abc.com/sites/teamA, and Team B has a site collection at http://abc.com/sites/teamB. These are referred to as path-based site collections, and they are the recommended approach for most corporate scenarios. Host-named site collections instead let you assign a unique DNS name to each site collection. For example, you can address them as http://TeamA.abc.com and http://TeamB.abc.com, which helps when scaling to large numbers of sites. In short, this feature allows individual site collections to have their own top-level URL.

The following code snippet programmatically creates a host-named site collection with the URL https://sharepointzen.wordpress.com in the SharePoint Server 2010 web application with the URL http://sharepoint:

SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://sharepoint"));
SPSiteCollection sites = webApp.Sites;
SPSite site = sites.Add("https://sharepointzen.wordpress.com", "Test",
    "Test Site", 1025, "STS#0", @"domain\abc", "Arshad Riz", "ariz@test.com",
    @"domain\abc1", "Ak Riz", "akriz@test.net", true);

PowerShell can also be used in place of server object model code. Please refer to: http://technet.microsoft.com/en-us/library/cc424952.aspx
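For quick reference, the same host-named site collection can be created with the New-SPSite cmdlet. The sketch below simply mirrors the object-model snippet above (same URLs, template, and owner alias are assumed):

```powershell
# Sketch: create a host-named site collection with PowerShell.
# -HostHeaderWebApplication identifies the web application that hosts it.
New-SPSite "https://sharepointzen.wordpress.com" `
    -HostHeaderWebApplication "http://sharepoint" `
    -Name "Test" -Description "Test Site" `
    -Language 1025 -Template "STS#0" `
    -OwnerAlias "domain\abc"
```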



Invoking SharePoint Workflows manually from a Custom Timer job

I recently had a requirement to create a custom timer job in SharePoint 2010 to manually invoke a custom Visual Studio-developed workflow associated with a SharePoint list. See the following code snippet for details.

public class ListItemWorkflowProcessingJob : SPJobDefinition
{
     enum ProcessingStatus
     {
          NotCompleted,
          Completed
     };

     public ListItemWorkflowProcessingJob()
     {
     }

     public ListItemWorkflowProcessingJob(string jobName, SPWebApplication webApplication, SPServer server, 
     SPJobLockType targetType) : base(jobName, webApplication, server, targetType)
     {
         Title = "List Item Workflow Processing timer job";
     }

     public ListItemWorkflowProcessingJob(string jobName, SPWebApplication webApplication)
     : base(jobName, webApplication, null, SPJobLockType.Job)
     {
         Title = "List Item Workflow Processing timer job";
     }

     public override void Execute(Guid contentDbId)
     {
        SPWebApplication webApplication = this.Parent as SPWebApplication;
        processListItemWorkflows(webApplication);
     }

     public void processListItemWorkflows(SPWebApplication webApplication)
     {
         using (SPSite site = webApplication.Sites["/"])
         {
          #region Action Delegates
          Action<SPListItem> CancelWorkflows = (ListItem) =>
          {
               SPWorkflowCollection workflows = ListItem.Workflows;
               foreach (SPWorkflow workflow in workflows)
               {
                   SPSecurity.RunWithElevatedPrivileges(() 
                           => SPWorkflowManager.CancelWorkflow(workflow));
               }

          };

          Action<SPListItem> StartWorkflow = (ListItem) =>
          {
               // ID value from Workflow elements.xml file
               Guid wfBaseId = new Guid("54a15d07-f2d7-4626-b82d-5a853802ad0e");
               var wfa = (from SPWorkflowAssociation spwfa in 
                          ListItem.ParentList.WorkflowAssociations
                           where spwfa.BaseId == wfBaseId && spwfa.Enabled
                           select spwfa).FirstOrDefault();
                if (wfa != null) // guard against a missing or disabled association
                {
                    site.WorkflowManager.StartWorkflow(ListItem, wfa, wfa.AssociationData, true);
                }
          };


              #endregion Action Delegates

              site.AllowUnsafeUpdates = true;
              SPList MyList = site.RootWeb.Lists["Custom List"];

              // Null-check the field before ToString to avoid a NullReferenceException
              var listItems = from SPListItem item in MyList.Items
                              where item["ProcessingStatus"] != null
                                 && item["ProcessingStatus"].ToString()
                                    == ProcessingStatus.NotCompleted.ToString()
                              orderby item.Title ascending
                              select item;

              foreach (SPListItem ListItem in listItems)  //each item in the list
              {
                 try
                 {
                     CancelWorkflows(ListItem);
                     ListItem.Update();
                      StartWorkflow(ListItem);

                      ListItem["ProcessingStatus"] = ProcessingStatus.Completed.ToString();
                     ListItem.Update();

                    //wait for 10 seconds
                     Thread.Sleep(10000);

                  }
                  catch (Exception ex)
                  {
                     throw;
                  }
                }

                MyList.Update();
                site.AllowUnsafeUpdates = false;
           }

        }
}

The above custom timer job cancels any existing workflow instances running on SharePoint list items before manually invoking a new workflow instance. The StartWorkflow Action<> delegate starts a new workflow instance on a SharePoint list item by using the workflow ID GUID value specified in the workflow solution's elements.xml file.



Dynamically updating markup within a SharePoint 2010 Content Editor Web-Part (CEWP) using JavaScript

Here is an example of dynamically updating URL links within a SharePoint 2010 Content Editor Web Part (CEWP) by reading the values from a SharePoint list using JavaScript.


<script type="text/javascript">

var clientContext = null;
var web = null;
ExecuteOrDelayUntilScriptLoaded(GetMyTimeSheetURL, "sp.js");

function GetMyTimeSheetURL()
{
 clientContext = SP.ClientContext.get_current();
 web = clientContext.get_web();
 var list = web.get_lists().getByTitle('Popular Links');

var camlQuery = new SP.CamlQuery();
 camlQuery.set_viewXml('<View><Query><Where><Eq><FieldRef Name="Title"/><Value Type="Text">My Timesheet</Value></Eq></Where></Query><ViewFields><FieldRef Name="LinkUrl" /></ViewFields></View>');

this.listItems = list.getItems(camlQuery);
 clientContext.load(listItems);
 clientContext.executeQueryAsync(Function.createDelegate(this, this.onListItemsLoadSuccess),
 Function.createDelegate(this, this.onQueryFailed));

}


function onListItemsLoadSuccess(sender, args) {
 var enumerator = this.listItems.getEnumerator();
 while (enumerator.moveNext()) {
 var item = enumerator.get_current();
 var url = item.get_item('LinkUrl');
 document.getElementById("contentDiv").innerHTML = '<li><a href="' + url + '">My TimeSheet</a></li>';
 }
}
function onQueryFailed(sender, args) {
alert('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());}

</script>

<div class="resources-box"><h3>Key Resources</h3>
<ul id="contentDiv"><li><a href="#">My TimeSheet</a></li></ul>
</div>

The above markup can be added into a CEWP and uses the SharePoint 2010 JavaScript client object model to dynamically update the URL links within the web part based on values in a SharePoint list. The success callback onListItemsLoadSuccess does not return a value, which is why the code uses document.getElementById("contentDiv").innerHTML to update the anchor markup directly.



Using SPLongOperation for Synchronous Long running tasks in SharePoint

The Microsoft.SharePoint.SPLongOperation class is useful for providing consistent feedback to users during long-running synchronous operations. The documentation tells us that this class "Sets the Web page image to the image used by the server to indicate a lengthy operation (typically, an animated image with associated text)".

For example, consider a scenario in which a SharePoint web part page is used to add users to an external system via a WCF service. The add-user process may be lengthy, depending on the business logic within the external system, and can be wrapped in an SPLongOperation by calling the static SPLongOperation.Begin(String, String, SPLongOperation.BeginOperation) method, where the SPLongOperation.BeginOperation delegate specifies the method that performs the long operation. The two string parameters set the SPLongOperation.LeadingHTML and SPLongOperation.TrailingHTML properties (the "associated text" that is displayed). When the call to the WCF service completes successfully, the operation is ended by calling either the End or EndScript method, depending on whether the window was opened as a dialog; this can be determined by checking the query string parameter "IsDlg", which equals 1 in a dialog window. Here is an example of using SPLongOperation:

try
{
    SPLongOperation.Begin(
        "Calling WCF Service for adding user into External System",
        "Please wait for this process to complete. This may take a few seconds.",
        delegate(SPLongOperation longOp)
        {
            try
            {
                //Code for calling the WCF service for adding a user into an External System goes here

                // Now end the long operation
                if (Context.Request.QueryString["IsDlg"] != null)
                {
                    longOp.EndScript(String.Format(CultureInfo.InvariantCulture,
                        "<script type=\"text/javascript\">window.frameElement.commonModalDialogClose({0}, {1});</script>",
                        1, String.Format("\"{0}\"", returnValue)));
                }
                else
                {
                    string url = Context.Request.UrlReferrer.AbsoluteUri;
                    longOp.End(url,
                       Microsoft.SharePoint.Utilities.SPRedirectFlags.DoNotEndResponse,HttpContext.Current,"");
                }
            }
            catch (ThreadAbortException) { /* thrown when redirected */ }
            catch (Exception ex)
            {
                string exMessage = ex.InnerException != null ? ex.InnerException.Message : ex.Message;
                string message = String.Format("Error occurred whilst adding user into External System: {0}", exMessage);
                RedirectToErrorPage(message);
            }
        });
}
catch (ThreadAbortException) { /* thrown when redirected */ }
catch (Exception ex)
{
    string exMessage = ex.InnerException != null ? ex.InnerException.Message : ex.Message;
    string message = String.Format("Error occurred when calling WCF Service add a user into External System: {0}", exMessage);
    RedirectToErrorPage(message);
}

//The RedirectToErrorPage(message) method calls the SPUtility.TransferToErrorPage method
private void RedirectToErrorPage(string message)
{
    if (string.IsNullOrEmpty(this.Referrer))
    {
        SPUtility.TransferToErrorPage(message);
    }
    else
    {
        SPUtility.TransferToErrorPage(String.Concat(message, " {0}."), "Return to calling Web-Part page.", this.Referrer);
    }
}

Additional information on SPLongOperation can be found here:

http://blog.symprogress.com/2010/10/close-modal-dialog-when-splongoperation-finish/

http://dotnetfollower.com/wordpress/2011/08/sharepoint-how-to-use-splongoperation/



SharePoint 2010 Diagnostic and Usage Analysis Logs best practice

SharePoint 2010 has improved ULS logging: diagnostic data can now be written to trace files, the Windows event log, and a SharePoint logging database (new in SharePoint 2010). By default, trace log files are created in a LOGS folder under the root folder where SharePoint is installed, also known as the 14 hive.

SharePoint diagnostic logging is very important and extremely helpful when we encounter problems with our SharePoint environment. However, diagnostic logging can be ineffective at times, and can even slow SharePoint down if it is not managed properly. The one thing you should ABSOLUTELY do is move the ULS logs off the system drive. ULS is designed to stop logging if it perceives a disk space issue, so moving the logs off the system drive ensures that logging will not fill up the system drive and that ULS logging will not contend with your page file for disk I/O. Note that in order to change the location of the log file, the path must exist on ALL SharePoint servers, and the folder's permissions must allow the SharePoint service to write to it.

There are two sets of logs you want to move in the SharePoint 2010 farm environment, the diagnostic logs and the usage logs.

Diagnostic logs:

With Central Admin:

Central Admin > Monitoring > Configure Diagnostic Logging (/_admin/metrics.aspx). The setting is the "Trace Log" path at the bottom. It is recommended to change only the drive letter and leave the rest of the path alone; it will make things easier to find later on.

With PowerShell:
You can also use PowerShell to change this. The cmdlet is Set-SPDiagnosticConfig and the parameter is –LogLocation.

Set-SPDiagnosticConfig -LogLocation "E:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS"

Usage logs:

With Central Admin:

Central Admin > Monitoring > Configure web analytics and health data collection (/_admin/LogUsage.aspx). The setting is the “Log file location” setting. Set it to the same path you did the Trace Log above.

With PowerShell:

The PowerShell cmdlet to alter this is Set-SPUsageService and the parameter is –UsageLogLocation.

Set-SPUsageService -UsageLogLocation "E:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS\"



SharePoint 2010 Storage Planning and Performance

SharePoint is unique in that it serves a broad range of use cases. This single platform can function as a relatively static content management system with modest storage requirements, a collaboration system with significant storage requirements, or even a large-scale document repository with extreme storage requirements, all in the same farm implementation! It only takes one of these functional requirements to get the SharePoint movement started in an organization. Problems can arise, however, when storage requirements are not properly planned for and the business begins to expand the original purpose of the SharePoint implementation. Storage performance is particularly important in the SharePoint ECM solution due to document archive requirements. It is often necessary to store documents for seven years or more. This can be a challenge for organizations that have high document volumes resulting from business processes with partners, vendors, and customers.

A good rule of thumb for large content repositories with a high requests-per-second (RPS) requirement of several hundred or more is to provision 2 I/O operations per second (IOPS) per GB of content. For example, if the content database size is around 1.5 TB (1,536 GB), the IOPS requirement would be approximately 3,072. If the content storage requirement or the RPS requirement is lower, then this rule-of-thumb or "starting point" IOPS guidance can be adjusted downward.
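The rule-of-thumb arithmetic above can be expressed as a small helper (a sketch; the 2 IOPS-per-GB figure is simply the guidance quoted above):

```javascript
// Rule-of-thumb IOPS estimate: content size in GB times IOPS per GB.
function requiredIops(contentTb, iopsPerGb) {
  const contentGb = contentTb * 1024; // terabytes to gigabytes
  return contentGb * iopsPerGb;
}

console.log(requiredIops(1.5, 2)); // 3072 IOPS for a 1.5 TB content database
```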

Sample scenario:

  • Consider a scenario with around 1.5 TB of SharePoint 2010 content that has an RPS requirement of around 2,500 to 2,800 and needs to be managed on a shared SAN:

 

  1. If the content is managed on five 15,000 RPM SAS drives, each 600GB in size, to build a 2.4TB RAID 5 disk array in the shared storage area network (SAN), the drives would provide between 700 and 1,000 IOPS depending on the caching capabilities of the SAN. The content storage requirements are adequately provided by this configuration but performance is impacted due to the low IOPS.

 

  2. If the content is managed on sixteen 15,000 RPM SAS drives that are 146GB in size, the infrastructure team can build two eight-drive RAID 5 arrays and divide the content databases between them. This would theoretically yield two 1TB arrays with something like 2,500 to 2,800 combined IOPS capability, providing adequate performance for the SharePoint environment.

 

The second configuration, with the additional disks, is significantly more expensive than using fewer, larger disks, but the cheaper option will cost more in the long run if end users are left waiting for content retrieval or, worse, tire of waiting and stop using the system entirely. The significant architecture point to remember is that, given the same total array size, an array with more, smaller disks will significantly outperform an array with fewer, larger disks. Another key consideration is the choice between a shared SAN and direct-attached storage, as outlined below.
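The disk figures in the two scenarios above can be sanity-checked with a quick RAID 5 capacity calculation (a sketch; RAID 5 gives up one drive's worth of capacity to parity):

```javascript
// RAID 5 usable capacity: one drive's worth of space goes to parity.
function raid5UsableGb(driveCount, driveSizeGb) {
  return (driveCount - 1) * driveSizeGb;
}

console.log(raid5UsableGb(5, 600)); // 2400 GB, i.e. the 2.4 TB array in scenario 1
console.log(raid5UsableGb(8, 146)); // 1022 GB, i.e. roughly 1 TB per array in scenario 2
```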

The SharePoint 2010 platform's disk requirements go far beyond the content database to include service application databases, such as those used by Enterprise Search, which require extensive IOPS support. Shared SAN storage is often difficult to allocate properly because the systems sharing it often have peak usage times that coincide. A shared SAN can hurt SharePoint performance when it is set up as a large pool of disks shared by several products, including Microsoft Exchange, Microsoft SharePoint, other Microsoft SQL Server instances, and file shares, all of which contend for the same IOPS. The allure of the SAN is cost, which can be shared across cost centers: it is typically easier to justify the purchase of a single large SAN that services multiple departments than a quality DAS system that serves only SharePoint.

 

Direct-attached storage (DAS) is any storage that is directly attached to the server, typically by SCSI connection or Fibre Channel. The biggest benefit of DAS is the guarantee of unshared storage and direct control over configuration of the disk array. This usually results in a much more predictable performance pattern in terms of supplying SharePoint with all the IOPS it needs to efficiently serve users. While DAS vs. SAN pros and cons can always be debated, in most cases DAS is the preferred storage technology for SharePoint — regardless of whether it is installed on physical servers or in a virtual machine environment.

See links below for additional information on SharePoint 2010 Storage architecture:

Requests Per Second Required for SharePoint Products and Technologies:

http://blogs.technet.com/b/wbaer/archive/2007/07/06/requests-per-second-required-for-sharepoint-products-and-technologies.aspx

Storage and SQL Server capacity planning and configuration

http://technet.microsoft.com/en-us/library/cc298801.aspx

http://blogs.technet.com/b/rycampbe/archive/2011/08/23/virtualization-the-san-and-why-one-big-raid-5-array-is-wrong.aspx



Programmatically adding images on SharePoint WebPart Pages

There are multiple ways to add images programmatically to SharePoint web part pages from code. This article provides a couple of code snippets for adding images using the Image Web Part and the Content Editor Web Part. One benefit of adding images via web parts is that end users can later specify different images.

Image Web Part
If you want to add a non-clickable image as a placeholder on a web part page, use the "ImageWebPart" class to add the image programmatically, as in the following code snippet.
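A minimal sketch of the ImageWebPart approach; the site URL, page path, image path, and web part zone name are placeholder assumptions, not values from the original post:

```csharp
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;
using Microsoft.SharePoint.WebPartPages;

public static class ImageWebPartSample
{
    public static void AddImage()
    {
        // Placeholder site and page URLs -- substitute your own.
        using (SPSite site = new SPSite("http://sharepoint"))
        using (SPWeb web = site.OpenWeb())
        {
            SPFile page = web.GetFile("SitePages/Home.aspx");
            using (SPLimitedWebPartManager manager =
                page.GetLimitedWebPartManager(PersonalizationScope.Shared))
            {
                ImageWebPart imagePart = new ImageWebPart
                {
                    Title = "Company Logo",
                    ImageLink = "/SiteAssets/logo.png" // non-clickable image source
                };
                manager.AddWebPart(imagePart, "Left", 0); // zone ID and index
            }
        }
    }
}
```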

Content Editor Web Part with Hyperlink HTML
If you want to add a clickable image on a web part page, unfortunately the "ImageWebPart" will not work. As a workaround, you can programmatically add a "ContentEditorWebPart" and supply hyperlink HTML, as in the following code snippet.
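A minimal sketch of the ContentEditorWebPart workaround; the link target, image path, page path, and zone name are placeholder assumptions, not values from the original post:

```csharp
using System.Xml;
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;
using Microsoft.SharePoint.WebPartPages;

public static class ContentEditorWebPartSample
{
    public static void AddClickableImage()
    {
        // Placeholder site and page URLs -- substitute your own.
        using (SPSite site = new SPSite("http://sharepoint"))
        using (SPWeb web = site.OpenWeb())
        {
            SPFile page = web.GetFile("SitePages/Home.aspx");
            using (SPLimitedWebPartManager manager =
                page.GetLimitedWebPartManager(PersonalizationScope.Shared))
            {
                // The CEWP Content property takes an XmlElement whose inner
                // text is the HTML to render -- here, an image wrapped in a link.
                XmlDocument xmlDoc = new XmlDocument();
                XmlElement content = xmlDoc.CreateElement("Content");
                content.InnerText =
                    "<a href=\"http://abc.com\"><img src=\"/SiteAssets/logo.png\" alt=\"Logo\"/></a>";

                ContentEditorWebPart cewp = new ContentEditorWebPart
                {
                    Title = "Clickable Logo",
                    Content = content
                };
                manager.AddWebPart(cewp, "Left", 0);
            }
        }
    }
}
```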