Reporting on the SharePoint External Lists

Today we faced an interesting issue; was the limitation technical, human, or a combination of both? Not entirely sure about that. I thought it was worth sharing, as it definitely ties in with the thought process, which involves both aspects. I am simplifying the scenario as well as abstracting the client details for obvious reasons.

The requirement goes something like this…

There are different users in the system who have different levels of access and play different roles. Some are responsible for entering the data, while others can look through the entered data, edit it if required, and draw reports out of it.

The technical stack consists of SharePoint as the base, with InfoPath as the forms technology; the data from the forms finally lands in a SQL Server database in the backend. The InfoPath forms are connected to an external list, which in turn uses BCS to get the data from the same SQL Server.

The dilemma…

There is a reporting and analytics need on the data present in the external list. The issue is that external lists are not easily exportable to Excel, so what is the alternative?

The solution…is it?

Since it is all tied into SharePoint, your brain starts to spin and yell fancy SharePoint terms, and you run through SharePoint Workspace, SharePoint Designer and even Report Builder 3.0. The challenge is that there isn’t any way to consume that information using these tools directly unless you opt for a customized solution (involving code) rather than a configurable one.

So you start to think harder and in the process start making things more complicated; it’s a “DEV TRAIT”, there is a natural affinity towards complicated solutions 🙂 …you also start looking into code-based solutions, or maybe 3rd party solutions, to make it happen.

Wait a min…need a break!!!

Now this sounds familiar, doesn’t it? There is a problem and you start looking for a solution; a solution comes to mind but has some limitations; you start looking at addressing that limitation next, and in the process you get carried far, far away from the actual problem you were trying to solve. This has happened to me multiple times; not sure about you?

I think this is exactly the time to take a break…and then comes the eureka moment!

Any external data items that are brought into SharePoint using BCS are not stored in SharePoint’s content database. Here is a blurb from an MSDN article, just in case you want to check on that.

Notice that unlike a SharePoint list whose data is stored in the SharePoint content database, the data in an external list is stored only in the external system. External data is brought into the SharePoint list at run time when you navigate to the list.

So there is now a simple solution to the reporting\analytics need: instead of going the roundabout way through the SharePoint external list, we can access the SQL Server database table\view directly. Since the (committed) information is the same in both places, you will get the same results as the SharePoint list, and you can build fancy reports or do pivoting using Excel, etc. It worked just fine for this situation.
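
To make the idea concrete, here is a minimal sketch of what that direct access could look like from code; the server, database and view names (dbserver, FormsDB, dbo.vw_FormSubmissions) are placeholders, not the client’s actual schema. The same table\view can of course be pointed to directly from Excel or an SSRS data source without writing any code.

// A minimal sketch: query the backing SQL Server view directly instead of the external list.
// dbserver, FormsDB and dbo.vw_FormSubmissions are hypothetical names.
using System;
using System.Data;
using System.Data.SqlClient;

class DirectReportingQuery
{
    static void Main()
    {
        string connStr = "Data Source=dbserver;Initial Catalog=FormsDB;Integrated Security=SSPI";

        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT * FROM dbo.vw_FormSubmissions", connStr))
        {
            DataTable submissions = new DataTable();
            adapter.Fill(submissions);   // same committed rows that the external list surfaces
            Console.WriteLine("Rows available for reporting: " + submissions.Rows.Count);
        }
    }
}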

More than the technical part of it, the intent of sharing this incident was to highlight how we at times get trapped in our own thought process and force ourselves into a different direction than the one we initially desired!


Multiple consecutive events firing for wiki pages

 

Scenario:

Let’s say we have a wiki library which has an event receiver attached to the “ItemAdded” event. We write a piece of code that is supposed to fire when an item is added. For the sake of simplicity, let’s say we want to audit who created a new wiki page, so we log it in some way. After writing and deploying the code, we observe that not only does the “ItemAdded” event fire but the “ItemUpdated” event fires as well. Why does SharePoint do that?

Well, the cause is pretty obvious: a wiki page is created, and right after it’s created it is opened in edit mode, hence the two consecutive events. There is a nice article by Damon on this, by the way (Managing ItemUpdating and ItemUpdated Events Firing Twice in a SharePoint Item Event Receiver).

Challenge:

We just want to log\track a single event, since it’s very much possible that the user may not edit the page at all or may simply cancel out of it.

Q: how do we conditionally handle these events?

One possible solution:

Before we look at the possible solution, a couple of facts:

  1. As you may know, each event (ItemAdding, ItemAdded, etc.; check this out for the complete listing of SharePoint events) generates a new “instance” of the class that extends the base class SPItemEventReceiver.
  2. The ItemAdded event is always going to fire before the ItemUpdated event, irrespective of the Synchronization property.

The issue is to capture the fact that a new wiki page is being added, preserve that fact between the “ItemAdded” event and the “ItemUpdated” event, and then, based on that information, conditionally process the ItemUpdated event. Here is where a “Key-Value” based collection can come to our rescue.

The approach:

Here is a high-level outline of the steps…

  1. Create a static variable of a collection type, e.g. a Hashtable.
  2. In the “ItemAdded” event, use the “ListItemId” property from the event args (named “properties” by default) and store it as the “key” in the variable created in step 1; the value part is a flag or status of some kind, e.g. I used “new”.
  3. In the “ItemUpdated” event, use the same variable created in step 1 and look up the flag\status using the same “ListItemId” property from the event args. Once the value is available, we can do the conditional processing.

So even though multiple events fire in there (which is technically correct), we can still tell them apart and conditionally do some processing. One important thing to remember is that once we have used the value from the collection variable in the “ItemUpdated” event, we need to delete it, so that the variable stays light-weight and doesn’t cause unnecessary load.

Sample (incomplete) code:

using System;
using System.Collections;
using System.Diagnostics;
using Microsoft.SharePoint;

public class EventReceiver1 : SPItemEventReceiver
{
    // Shared across event receiver instances; tracks items that were just added.
    // (Hashtable is safe for multiple readers / one writer; add locking if
    // concurrent adds are expected.)
    static Hashtable listItemStatus = new Hashtable();

    public override void ItemAdded(SPItemEventProperties properties)
    {
        base.ItemAdded(properties);
        Debug.WriteLine("wiki - item added at: " + DateTime.Now);

        // Flag this item as newly added so ItemUpdated can recognize it.
        EventReceiver1.listItemStatus.Add(properties.ListItemId, "new");
    }

    public override void ItemUpdated(SPItemEventProperties properties)
    {
        base.ItemUpdated(properties);

        string listIdStatus = EventReceiver1.listItemStatus[properties.ListItemId] == null ?
                string.Empty :
                EventReceiver1.listItemStatus[properties.ListItemId].ToString();

        if (!String.IsNullOrEmpty(listIdStatus) &&
                String.Compare(listIdStatus, "new") == 0)
        {
            // This is the ItemUpdated that immediately follows ItemAdded; skip processing.
            Debug.WriteLine("wiki - new item added; do nothing at: " + DateTime.Now);
            // Very important to get rid of the entry - this can be moved to a finally block as well.
            EventReceiver1.listItemStatus.Remove(properties.ListItemId);
        }
        else
        {
            Debug.WriteLine("wiki - regular item updated at: " + DateTime.Now);
        }
    }
}

The same logic can be extended for other lists or libraries as appropriate to their respective behavior for the various events.

Hope this helps…


Cloud Computing – Nice Presentation

Ubuntu released 9.10 today and, out of all the mixed reviews I read, I liked the presentation by Simon Wardley the most.

Not only because of the subject but because of the way he presented it… I am sure you would either love it or hate it but you simply can’t ignore it 🙂

HardDrive to RAM – Excel Services

 

Quick Background…

The building blocks for Excel Services…

  • Excel Calculation Services

Excel Calculation Services (ECS) takes care of all the calculations that happen on the server side, opening files from the trusted locations and refreshing the workbooks, etc. It can be considered the server counterpart of the Excel client that we have.

It also takes care of the security aspect, meaning it does not allow access to unauthorized external data sources, etc.

It can be load balanced across servers, can be installed separately from the WFE server, and also caches the sheets, charts, external query results and the state of ongoing calculations.

  • Excel Web Access

As the name indicates, Excel Web Access is responsible for allowing access to sheets via the browser, where they are rendered as HTML.

Apart from saving changes, one can do almost everything that is possible in the Excel client, with no client installation on the end user’s machine.

It is available in the form of a web part and can be used to display a whole Excel sheet or just part of it, the only constraint being that the file has to come from a trusted location.

  • Excel Web Service

This is the developer’s side of the story and allows all of this to be done programmatically.

Things like accessing a centralized calculation formula with different parameters can help standardize things across the organization, reducing duplication of effort and avoiding errors due to multiple versions of the business rules.
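
To make the developer angle a bit more tangible, here is a hedged sketch of calling Excel Web Services. It assumes a web reference named ExcelWebService has been generated from the ExcelService.asmx endpoint; the server URL, workbook path, sheet and range names below are made-up placeholders rather than anything from a real deployment.

// A minimal sketch, assuming a web reference "ExcelWebService" generated from
// http://<server>/_vti_bin/ExcelService.asmx; all paths and range names are placeholders.
using System;
using System.Net;

class CentralFormulaClient
{
    static void Main()
    {
        ExcelWebService.ExcelService svc = new ExcelWebService.ExcelService();
        svc.Url = "http://server/_vti_bin/ExcelService.asmx";
        svc.Credentials = CredentialCache.DefaultCredentials;

        ExcelWebService.Status[] status;

        // Open the workbook from its trusted location; a session id comes back.
        string sessionId = svc.OpenWorkbook(
            "http://server/Shared Documents/CalcRules.xlsx", "en-US", "en-US", out status);

        // Push a parameter into a named range, recalculate, and read the result back.
        svc.SetCellA1(sessionId, "Sheet1", "InputRate", 0.07);
        svc.CalculateWorkbook(sessionId, ExcelWebService.CalculateType.Recalculate);
        object result = svc.GetCellA1(sessionId, "Sheet1", "Result", true, out status);

        Console.WriteLine("Centralized formula returned: " + result);
        svc.CloseWorkbook(sessionId);
    }
}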

  • Excel Calculation Service Proxy

This sits between Excel Web Access and Excel Web Services and takes care of the coordination business.

It can also do load balancing when there are multiple servers; in a single-server scenario it has nothing much to do other than handing over the request.

  • Steps to configure Excel Services

There are two steps required before users can start using Excel Services: enabling the service and configuring the trusted locations for the files. Files that are not stored in a trusted location can’t be used in Excel Services. A trusted location can be a document library, a URL or a shared folder.

Enabling the service can be done from the Central Administration web site under Operations\Services on Server, by clicking the Start link.

The trusted file locations can be configured from the shared services administration section in Central Administration. One has to pick the Shared Service Provider and then go to Excel Services management\Trusted file locations; here we can add, edit or delete the said file locations.

 

I want to keep it short; there are still other aspects to this topic, like publishing, named parameters, security, performance, external data sources, etc. They come after the basic components mentioned above have been put in place and configured.

Going through the above notes serves the purpose for me; you are most welcome to add more to it though 🙂


HardDrive to RAM

 

Ya, ya, I know what you are thinking: what a weird name for a blog entry!!!

But, hey everyone, here is the incident that compelled me to take this route…

Place & Time: Friday 13th 1:19 PM, Mountain View, CA

I am sharing a nice joke with my colleagues when, out of the blue, the senior guy says: hey Aseem, I have some stuff for you; there is a customer coming who wants to go through the beautiful world of Excel Services, can you contribute your two cents? And then there were a couple of other business details that he talked about (believe me, you won’t be interested in knowing those).

In my mind, I am like… Shoot!!! I just returned from another customer who wanted to talk about SSRS and related stuff, so I had gone through all the SSRS material; prior to that it was .NET\WPF and SSRS, before that it was PerformancePoint, ProClarity and SSIS, and then something else… I suddenly realized that it had been quite a while since I had touched the SharePoint stuff 😦

It’s not like I am not comfortable with it; on the contrary, I have done presentations and POCs on SharePoint topics on multiple occasions… it’s only that somehow my CPU\brain (as a matter of fact, I have observed the same issue with other brands as well :-P) simply keeps compressing the older stuff and pushing it down the stack on the hard drive\memory, and then if I have to talk about a subject, I have to dig it out, refresh it and bring it back into RAM\upper memory… 🙂

It has been my experience, though, that if the basic concepts are clear it becomes very easy to recall the stuff and quickly get ready for a presentation or solve a business problem. I am still a staunch believer in this, and the series that I will be blogging about is like a reference for me, to keep my basic concepts handy, organized and, most importantly, short and sweet. At the same time I wish to share it with everyone and would be happy if it is of any use to anyone, because I also believe that the best way to gain knowledge is to share it with others and not be afraid of making mistakes… because that’s how you learn.

Do join me in this effort and please be patient with my innocent mistakes 🙂 Also, please be advised that, owing to personal opinion, conflicts in terms of important information that should be present but may have been skipped are very likely to happen. In that case I would recommend using the latest online help. It is a wonderful piece of work, into which a lot of people have put a lot of effort, and more importantly it is the complete reference.


Handling TIFF data in Reporting Services

So, here is an interesting piece of work that we did recently as a part of an SSRS POC. The very nature of the requirement makes it interesting and exciting.

The Requirement…

There are a couple of pages in TIFF format (that might come from a fax or some other application) and they need to be brought in and displayed via SSRS along with the other data fields. The header and the footer, along with the summary data, stay constant though.

The challenge…

Out of the various image formats that SSRS supports out of the box, TIFF is not present 🙂 We can use the BMP MIME type to display the first page, but nothing beyond that. So the deal is to display and merge the individual pages along with the other data.

The Solution…( of course one of the possible ones 🙂 )

The solution can be broken into smaller parts, with multiple technologies coming together to address the issue, each of them having its own significant impact.

To start with, the format of the file that we are dealing with has a major impact; then comes the set of APIs that is leveraged to break the file apart and extract the information; storing these individual pieces of information is another important aspect; and finally comes the SSRS part, which displays the overall result. Let’s take a look at each of them individually.

TIFF is a flexible file format that is capable of containing images, text, JPEG and other types of data, all in a single file. One can find more information here.

The other essential part of the solution is the wonderful new stuff that Windows Presentation Foundation has to offer as far as processing of the various image formats goes; check out the Imaging Namespace. Here we get the APIs that can be utilized to break the TIFF file into individual frames (a frame is the term used in the APIs for a single page of the TIFF file), which can then be merged with the related database records.

As far as storage goes, the solution is an attempt at a proof of concept, hence the basic approach of storing the individual frames in files has been used; the focus was more on merging the TIFF pages with the relational data. This can be improved later on. The individual frames stored on disk as small files can then be saved into the database in a varbinary column.
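
For illustration, here is a minimal sketch (not the downloadable miniapp) of what the frame splitting and storage could look like; the file path, customer name and connection string are placeholders, and it writes into the dbo.TIFFTable whose creation script appears a little further below. It needs references to PresentationCore and WindowsBase for the WPF imaging types.

// A minimal sketch: split a multi-page TIFF into frames and store each frame
// in the varbinary column of dbo.TIFFTable. Path, customer name and connection
// string are placeholders.
using System;
using System.Data.SqlClient;
using System.IO;
using System.Windows.Media.Imaging;

class TiffSplitter
{
    static void Main()
    {
        using (FileStream fs = File.OpenRead(@"C:\Temp\SampleReport.tif"))
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=AdventureWorks;Integrated Security=SSPI"))
        {
            conn.Open();

            // Each page of the TIFF surfaces as a BitmapFrame on the decoder.
            TiffBitmapDecoder decoder = new TiffBitmapDecoder(
                fs, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.Default);

            foreach (BitmapFrame frame in decoder.Frames)
            {
                // Re-encode the single frame so it can be stored on its own.
                TiffBitmapEncoder encoder = new TiffBitmapEncoder();
                encoder.Frames.Add(frame);

                byte[] frameBytes;
                using (MemoryStream ms = new MemoryStream())
                {
                    encoder.Save(ms);
                    frameBytes = ms.ToArray();
                }

                using (SqlCommand cmd = new SqlCommand(
                    "INSERT INTO dbo.TIFFTable (CustomerName, [Image]) VALUES (@name, @img)", conn))
                {
                    cmd.Parameters.AddWithValue("@name", "Contoso");
                    cmd.Parameters.AddWithValue("@img", frameBytes);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}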

Also, the process of breaking the TIFF file into smaller files and storing them in the relevant table can be automated via a trigger, or in batch form via a job.

Once the table has the data for the individual frames, it can then be (inner) joined with the other table holding the header, footer and other summary data, and the result can be displayed in the report using an image control. The sample code below doesn’t use that structure though; it dumps everything into a single table for the sake of convenience.

The Code…

The code that is made available uses a sample TIFF file generated from everyone’s favorite AdventureWorks database; it lists the top 2 products for each subcategory.

To generate this sample file

1. Use the following query as the dataset

2. Preview it in the simple SSRS report

3. Save it in the TIFF format from the preview pane

 

SELECT sc.Name AS subCategory, p.Name AS productName
FROM Production.Product AS p INNER JOIN Production.ProductSubcategory AS sc
    ON p.ProductSubcategoryID = sc.ProductSubcategoryID
WHERE p.ProductID IN
    (SELECT TOP (2) ProductID
    FROM Production.Product AS p1
    WHERE ProductSubcategoryID = p.ProductSubcategoryID)
ORDER BY subCategory, productName

 

Once the TIFF file is available, we need a table to hold the individual pages. The following script can create one:

CREATE TABLE [dbo].[TIFFTable](
    [CustomerName] [nvarchar](50) NULL,
    [SomeDetails] [nvarchar](50) NULL,
    [SomeMoreDetails] [nvarchar](50) NULL,
    [Image] [varbinary](max) NULL
)

 

After the table is created, the code available at this location builds a mini WPF application that, just for the sake of looking into the file and learning, allows one to browse the individual pages of the TIFF file and later save them to the TIFFTable created above.

Before running the code remember to make changes in the code for…

1. FilePath value

2. Name of the sample TIFF file

3. DB connection string

Run the miniapp and, after you are done browsing back and forth in the file, click the "Save" button; doing so breaks the file apart and stores the pieces in TIFFTable. You can do so for a couple of different customers to generate some data for the next report. This data can then be displayed using an SSRS report in the normal fashion; the image control can be used to point to the varbinary column. Note that even though the format of the individual files is TIFF, we can change the format using the appropriate API if required. The report RDL shows one way of presenting the data; it groups the data based on the customers entered using the miniapp. You can always alter it and use it as appropriate. A screenshot of the miniapp follows…

 

Hope this piece of information is of some use to you…and if you come across a better way of handling this situation please share it with me 🙂


Multiple measure based KPIs for Performance Point

This post can be considered a continuation of the earlier post I had put up regarding the various ways of using multiple measures in PerformancePoint. It deals more with the monitoring part of the PerformancePoint trio and walks through a process one could use to create a KPI that consumes data from a model that handles more than one measure. The focus here is on the first approach, "Option One: Structure does the trick", i.e. using a dimension whose sole purpose is to hold the various measures in the model (cube). The other approach, using an assumption model, is pretty straightforward: one just has to include the appropriate assumption model in a particular model and can then use it right away in Dashboard Designer.

So, continuing with the previous scenario, we are interested in using two measures, say PowerCount and ChairCount, for all the rooms and want to show them in a KPI, which could later be part of a scorecard. These KPIs help us see how well we were able to match the member count in a room with the available power count or chair count. This is just an example to put the concept across and shouldn’t be related to anything in real life. The trick here is to understand the basics of the KPI and leverage them to come up with a business solution.

Let’s start with the basics… when one creates a KPI, one typically uses two metrics, the actual and the target. After specifying the two metrics, one has to do the data mapping for each. Step one is to create KPIs for both measures, and step two is to use them in a scorecard along with other dimensions, most often a time dimension.

Creating the KPI…

The process for creating the two KPIs is the same except for a slight change, as mentioned below. The individual steps follow…

1. Create a blank KPI.

2. Use the published cube as a data source via the data source mapping for the actual.

3. Select the default measure, "value".

4. In the "Select a dimension" section select the "measure" dimension.

5. Change the member selection from "default" to either ChairCount or PowerCount. This, along with step# 4, does the trick and gets the slice of the cube that we are interested in.

6. Repeat the same steps for target. One could possibly use a different data source & dimension combination for this.

7. Repeat the sequence for the other KPI as well; the difference is in step# 5, where one would choose the other member of the dimension than the one chosen earlier.

Creating the ScoreCard…

Once the KPIs have been created, one can use them in a scorecard and also add other dimension(s) on the columns, typically time or something else as per the requirement, to generate an appropriate context in the scorecard.

In the following figure, I have used a day-wise hierarchy to display the two KPIs for a couple of days. The target value that I have used is 10, with the scoring pattern being "closer to target is better"; also, not all of the days have data to show. One could alter the values displayed in the blank cells by changing a couple of properties.

Once this is in place the other dimensions can be added as per requirement.

Stay tuned, some more stuff to follow… 🙂

 
