Entity Framework 6 Update Taking REALLY Long Time

I ran into a strange issue updating an Entity Framework model against a newly built SQL Server 2014 database.  For some reason, it was taking a REALLY long time, even hours, just to connect.  Using SQL Server Profiler, we saw some activity that never completed.  I couldn’t figure it out until someone helped me by sending me this link: https://entityframework.codeplex.com/workitem/2445

Other users indicated that installing Cumulative Update 5 resolved the issue.  If you don’t want to install cumulative updates, changing the database compatibility level to SQL Server 2012 worked just fine as well.  The compatibility level change takes effect immediately, without a server-level restart.
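
For reference, the compatibility level change is a one-line statement; the database name below is just a placeholder for whatever database the EF model points at:

-- 110 corresponds to SQL Server 2012
ALTER DATABASE [MyDatabase] SET COMPATIBILITY_LEVEL = 110;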


BizTalk 2013 R2 Database Selection Error on Deployment

I’ve been involved in setting up a BizTalk 2013 R2 instance, and it’s been a good learning opportunity for me. We ran into an interesting issue today having to do with SQL Server permissions. The environment I work in is pretty locked down as far as permissions are concerned. While deploying a BizTalk application to the development environment, entering the database server name in the Server field meant no database could be selected: the selector for the BizTalk Configuration database reported that it couldn’t connect to the server to retrieve the database list. If the defaults were left in place, everything was OK, but once the server name was entered, it failed to pull the names of the databases available on the server for my AD account.

While some sites said adding sysadmin in SQL Server would help, what actually worked was granting the server-level VIEW ANY DATABASE permission to my login. Adding that one permission resolved the issue, and I was glad it did, because I wasn’t about to get sysadmin privileges :-D.
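
For reference, the grant itself is a single statement run against the instance; the login below is a placeholder for whatever AD account performs the deployment:

-- server-level permissions are granted from master, against the login
USE master;

GRANT VIEW ANY DATABASE TO [DOMAIN\DeployAccount];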

ASP.NET MVC Dynamic Reference Data Loading

MVC, like any other framework, has plenty of capabilities when it comes to rendering client markup; one method call can render pages of it. Using a server-side approach can simplify some of that rendering. For instance, here is one common situation where I often use a server-side approach within an application: a page may have a Kendo DropDownList, which is highly dependent on a list of data in JSON format. The initialization script looks like the following:

$("DropDownList").kendoDropDownList({
   dataSource [{ .. }],
   dataTextField: "Text",
   dataValueField: "Value"
});

Typically, the data source is some lookup data; in most applications, lookup data contains a key and a value. The dataTextField and dataValueField properties map to properties within these items. It’s helpful to have lookup data that’s standardized, because then it’s easy to create a helper function that does the following:

@helper LookupData(System.Collections.IEnumerable data)
{
   @Html.Raw(Newtonsoft.Json.JsonConvert.SerializeObject(data))
}

This helper converts data to JSON, which is used in the script below.

$("DropDownList").kendoDropDownList({
   dataSource @LookupData(Model.CustomerTypes),
   dataTextField: "Text",
   dataValueField: "Value"
});

Here we take a collection of reference items and convert it to JSON. On its own, that approach may not be all that useful, but where you can find real use is if you have a common reference table loader object. With one of those, the helper function can automatically retrieve a common dataset based on a given type of reference data:

@helper LookupData(Type referenceType)
{
    var container = Framework.Container; // a singleton reference to the dependency injection container
    var referenceData = container.GetInstance<IReferenceTableManager>(); // a utility class to load common reference data

    @Html.Raw(Newtonsoft.Json.JsonConvert.SerializeObject(referenceData.GetValues(referenceType)))
}

The loader uses an IReferenceTableManager implementation to perform the work. Behind the scenes, this could be anything, including Entity Framework. Below is one possible solution:

public class ReferenceValue
{
    public int ID { get; set; }

    public string Text { get; set; }
}

public interface IReferenceTableManager {
   IEnumerable<ReferenceValue> GetValues(Type type);
}

public class EFReferenceTableManager : IReferenceTableManager {
  public IEnumerable<ReferenceValue> GetValues(Type type)
  {
     var ctx = new EntityFrameworkContextInstance();

     //From a given type, there are multiple ways to get the data from the DB; I am not
     //aware of a way to query by type (at least easily). There may be a better way to do this,
     //but for the time being, the two possible ways to query reference data dynamically
     //are using reflection, or using a long if statement and type checking like the following
     //(assuming each reference entity exposes ID and Name columns):
     if (type == typeof(Countries)) {
       return ctx.Set<Countries>()
                 .Select(c => new ReferenceValue { ID = c.ID, Text = c.Name })
                 .ToList();
     }
     else if (type == typeof(States)) {
       return ctx.Set<States>()
                 .Select(s => new ReferenceValue { ID = s.ID, Text = s.Name })
                 .ToList();
     }
     // ... additional reference types ...

     throw new NotSupportedException("Unknown reference type: " + type.Name);
  }
}

It’s not the best implementation; I’m sure there is something better out there, but you get the idea as to how this dependency can now be resolved dynamically. The question most will have is whether this is scope creep on behalf of the view, i.e. whether loading the data is really a responsibility of the controller. It’s also possible to have the lookup helper examine a given model and extract the data from there instead. There are multiple ways to solve this problem.

Alternative Tricks To Importing Data From Excel

I often find myself tasked with importing some dataset into a database, and the source of that data is usually Excel. The common approach with a database like SQL Server is to use something like the Import and Export Wizard to load the data into a table. While that usually works, there are times when it doesn’t, because the data is non-standard or for who knows what reason.

The spreadsheet I’m about to show you would import easily, but let’s assume it doesn’t for some reason. Say we had a worksheet like this:

AccountNumber  LastName  FirstName  Active
12345          Smith     Tom        Yes
12643          Peter     Tom        Yes
2354365        Smith     Amy        No

Nothing complicated here. But assuming we can’t get it in, one trick I have found myself using is Excel functions that generate SQL statements. For instance, we could put the following formula in the spreadsheet, in row 2 next to the first data row:

=CONCATENATE("insert into Customers (AccountNumber, LastName, FirstName) values ('", A4, "', '", B4, "', '", C4, "')")

CONCATENATE is an Excel function, very similar to the .NET String.Concat method, that combines each of its arguments into one string. Instead of using an operator to concatenate values, each argument, separated by commas, is appended in order, and an argument can be a literal or a cell reference. The formula above takes the values in cells A2, B2, and C2 and combines them into the templated insert statement.

If we copy this formula down across all of the rows, it generates a batch of insert statements like:

insert into Customers (AccountNumber, LastName, FirstName) values ('12345', 'Smith', 'Tom')
insert into Customers (AccountNumber, LastName, FirstName) values ('12643', 'Peter', 'Tom')
insert into Customers (AccountNumber, LastName, FirstName) values ('2354365', 'Smith', 'Amy')

Again, nothing complicated, but it can be useful. It can even be used to skip inserts for inactive customers; if we update our formula:

=IF(D2="Yes",CONCATENATE("insert into Customers (AccountNumber, LastName, FirstName) values ('",A2,"', '",B2,"', '",C2,"')"),"")

This would only output the two active records:

insert into Customers (AccountNumber, LastName, FirstName) values ('12345', 'Smith', 'Tom')
insert into Customers (AccountNumber, LastName, FirstName) values ('12643', 'Peter', 'Tom')

The inactive record didn’t produce a statement. After generating the scripts, we can copy the whole set of statements, paste them into SQL Server Management Studio, and run them as a single batch; each statement executes separately. With this approach we could also do updates; maybe we want to sync the active status (a bit column in this scenario). We could use the following formula:

=CONCATENATE("update dbo.Customers set Active = ", IF(D2 = "YES", "1", "0"), " where AccountNumber = '", A2, "'")

Which generates:

update dbo.Customers set Active = 1 where AccountNumber = '12345'
update dbo.Customers set Active = 1 where AccountNumber = '12643'
update dbo.Customers set Active = 0 where AccountNumber = '2354365'

This trick may not seem all that useful on its own, but one scenario where it shines is inserting data into a table variable. Data in a table variable can be used to do research, massage the data before inserting it into a real table, or whatever else you want. For instance, suppose we had the following script shell in SQL Server:

declare @output table
(
  ROWID int identity(1,1) primary key clustered not null,
  Account varchar(100),
  First varchar(100),
  Last varchar(100),
  Active bit
)

-- paste the generated insert statements here

select count(*) from @output where Account like 'BN%'

select count(*) from @output where Active = 1

select count(*) from @output where Active = 0

select count(*) from @output where Account like '%99'

We use the following formula to generate the inserts into the table variable; because @output has an identity column (ROWID), the column list is specified explicitly:

=CONCATENATE("insert into @output values ('",A2,"', '",B2,"', '",C2,"', ",IF(D2="YES","1","0"),")")

Which produces the scripts:

insert into @output (Account, Last, First, Active) values ('12345', 'Smith', 'Tom', 1)
insert into @output (Account, Last, First, Active) values ('12643', 'Peter', 'Tom', 1)
insert into @output (Account, Last, First, Active) values ('2354365', 'Smith', 'Amy', 0)

And our final script is:

declare @output table
(
  ROWID int identity(1,1) primary key clustered not null,
  Account varchar(100),
  First varchar(100),
  Last varchar(100),
  Active bit
)

insert into @output (Account, Last, First, Active) values ('12345', 'Smith', 'Tom', 1)
insert into @output (Account, Last, First, Active) values ('12643', 'Peter', 'Tom', 1)
insert into @output (Account, Last, First, Active) values ('2354365', 'Smith', 'Amy', 0)


select count(*) from @output where Account like 'BN%'

select count(*) from @output where Active = 1

select count(*) from @output where Active = 0

select count(*) from @output where Account like '%99'
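
As one last step, the massaged rows can be promoted out of the table variable into the real table. This is only a sketch, assuming dbo.Customers exposes the same AccountNumber, LastName, FirstName, and Active columns used in the earlier statements:

-- promote only the rows that survived whatever checks the research queries uncovered
insert into dbo.Customers (AccountNumber, LastName, FirstName, Active)
select Account, Last, First, Active
from @output
where Active = 1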

And now we’ve done analysis on our results without needing to create a permanent SQL table; the total time to create the script may even be faster, depending on how proficient you are at Excel, which has a lot of nifty functions helpful for data conversion. Is this dirty? Yes. Is it a one-time throwaway solution? Yes. But it can be handy in a bind, and it may simplify a one-time migration later on.