In this post we take a closer look at BI architecture and continue delving into the details of our BI solution, along with a list of interesting BI architecture designs we want to share.
In our previous post we described how we optimized the ETL process in one of our BI solutions for a leading insurance brokerage company. That article only briefly touched on Business Intelligence architecture, so here are more details:
We are implementing a large enterprise solution for one of our customers. There are currently around 400,000 members in the system, and the number has been growing very fast since we began onboarding more and more customers. One of the next tasks was to build up-to-date reports that provide various statistics.
For example: a Total Premium report, a Gender report (the count of males and females in the system), a New Employee Additions report, and so on.
Calculating all of these numbers in real time whenever a user opens the Report section would be a slow operation and would put a heavy load on SQL Server. According to the specification, real-time calculation is not required: it is acceptable to show a report reflecting the system state as of the previous day. So we decided to implement BI reports.
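The idea of precomputing such a report can be sketched as follows. This is only an illustration, not our production code: the table and column names (`Members`, `Gender`, `GenderReport`) are hypothetical, and SQLite stands in for SQL Server.

```python
import sqlite3

# Hypothetical schema: a Members table with a Gender column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Members (Id INTEGER PRIMARY KEY, Gender TEXT)")
conn.executemany(
    "INSERT INTO Members (Gender) VALUES (?)",
    [("M",), ("F",), ("F",), ("M",), ("F",)],
)

# Nightly job: aggregate once and store the result in a small report table,
# so the Report section reads a handful of rows instead of scanning
# hundreds of thousands of members on every page view.
conn.execute(
    "CREATE TABLE GenderReport AS "
    "SELECT Gender, COUNT(*) AS Total FROM Members GROUP BY Gender"
)

print(dict(conn.execute("SELECT Gender, Total FROM GenderReport")))
```

The Gender report page then queries only the tiny `GenderReport` table, which is why showing yesterday's state is so cheap.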
We received a very good Reports Specification and started thinking about how best to implement the BI architecture:
We are happy to start a new and exciting category on our blog: Business Intelligence articles.
After a series of successfully designed and implemented BI solutions, I want to share our experience and note all the complex moments and issues we have faced.
Business Intelligence (BI) is the set of techniques and tools for transforming data into useful information for business analysis purposes.
In this article I want to share our experience with ETL (Extract, Transform, Load) optimization.
Extract, Transform, Load (ETL) is the foundation of data warehousing; these three processes feed any data warehouse. Extract acquires data from data sources, Transform converts it into the proper format, and Load pushes it to the final target (a data mart or data warehouse).
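The three stages can be sketched as a minimal pipeline. This is purely illustrative: the function names and the hard-coded sample rows are my own, not part of our actual solution.

```python
def extract():
    # Extract: acquire raw rows from a source (here, a hard-coded sample
    # standing in for a real database or file feed).
    return [{"name": " Alice ", "premium": "120.50"},
            {"name": "Bob", "premium": "99.90"}]

def transform(rows):
    # Transform: normalize names and convert premium strings to numbers.
    return [{"name": r["name"].strip().upper(),
             "premium": float(r["premium"])} for r in rows]

def load(rows, warehouse):
    # Load: push the cleaned rows into the target store.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

In a real warehouse each stage is of course far heavier (incremental extracts, lookups, surrogate keys), but the shape of the flow is the same.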
We designed and implemented a BI solution for the insurance brokerage company Willis Towers Watson.
It was a common BI architecture following all the general rules:
Windows Azure Blob storage is a service for storing large amounts of unstructured data that can be accessed from anywhere in the world via HTTP or HTTPS. A single blob can be hundreds of gigabytes in size, and a single storage account can contain up to 100 TB of blobs. Common uses of Blob storage include:
- Serving images or documents directly to a browser
- Storing files for distributed access
- Streaming video and audio
- Performing secure backup and disaster recovery
- Storing data for analysis by an on-premises or Windows Azure-hosted service
Windows Azure Blob storage has the following hierarchy:
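The hierarchy is storage account → container → blob, and it is mirrored directly in every blob's HTTP address. A small sketch (the account, container, and blob names are made up for illustration):

```python
def blob_url(account: str, container: str, blob: str) -> str:
    # Account -> container -> blob, reflected in the URL path.
    return f"https://{account}.blob.core.windows.net/{container}/{blob}"

print(blob_url("mystorageaccount", "logs", "2014/01/15/app.log"))
# → https://mystorageaccount.blob.core.windows.net/logs/2014/01/15/app.log
```

Note that blob names may contain `/`, which many tools display as virtual folders, even though the storage itself is flat inside a container.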
In this article I will show you how to write logs to Blob storage. As the logging framework I will use Log4net, because Blob storage works very well with it.
Log4Net is a popular logging framework, and if you have an existing application that you wish to move to Azure, you probably want to avoid rewriting it to use another logging framework. Luckily, keeping Log4Net as your logging tool in Azure is certainly possible, but there are a few hoops you have to jump through to get there. So we will write the logs from our project to Blob storage. For this purpose we can reuse the project from the previous topic, which already has a connection to Azure and to a Blob Storage account. So let's start.
As the very first step we need to add the Log4net library to our project. We can use NuGet for this. Open the MvcApplicationWebRole project, right-click the References folder, and select “Manage NuGet Packages”. The NuGet window will appear; enter “log4net” in the search box and press Install once it finds the library:
Add Log4net library to project