Azure Camp 2019

If you’re reading this I’m hoping you’re in the room for Azure Camp 2019 because otherwise there isn’t much point. This article is just a bunch of resources for you.

We have a code of conduct here https://confcodeofconduct.com/

Code

We’re going to use a couple of small sample projects during the day. They are on github at

Slides

There are some slides, and some of them have secret commands in them. They are up on OneDrive here: https://1drv.ms/f/s!AmN_cJhXT0rP1yd-qMdkgcHgvKD0

Class Status

This serves a double purpose. It lets you update your status and it is the demo we’re building today. http://classstatus.aspnetmonsters.com/

Using NodaTime with Dapper

Originally posted to: https://www.davepaquette.com/archive/2019/03/27/using-noda-time-with-dapper.aspx

This is a part of a series of blog posts on data access with Dapper. To see the full list of posts, visit the Dapper Series Index Page.

After my recent misadventures attempting to use Noda Time with Entity Framework Core, I decided to see what it would take to use Dapper in the same scenario.

A quick recap

In my app, I needed to model an Event that occurs on a particular date. It might be initially tempting to store the date of the event as a DateTime in UTC, but that’s not necessarily accurate unless the event happens to be held at the Royal Observatory Greenwich. I don’t want to deal with time at all, I’m only interested in the date the event is being held.

NodaTime provides a LocalDate type that is perfect for this scenario so I declared a LocalDate property named Date on my Event class.

public class Event
{
public Guid Id { get; set; }
public string Name { get; set; }
public string Description { get; set; }
public LocalDate Date {get; set;}
}

Querying using Dapper

I modified my app to query for the Event entities using Dapper:

var queryDate = new LocalDate(2019, 3, 26);
using (var connection = new SqlConnection(myConnectionString))
{
await connection.OpenAsync();
Events = await connection.QueryAsync<Event>(@"SELECT [e].[Id], [e].[Date], [e].[Description], [e].[Name]
FROM [Events] AS[e]");
}

The app started up just fine, but gave me an error when I tried to query for events.

System.Data.DataException: Error parsing column 1 (Date=3/26/19 12:00:00 AM - DateTime) ---> System.InvalidCastException: Invalid cast from 'System.DateTime' to 'NodaTime.LocalDate'.

Likewise, if I attempted to query for events using a LocalDate parameter, I got another error:

var queryDate = new LocalDate(2019, 3, 26);
using (var connection = new SqlConnection(myConnectionString))
{
await connection.OpenAsync();

Events = await connection.QueryAsync<Event>(@"SELECT [e].[Id], [e].[Date], [e].[Description], [e].[Name]
FROM [Events] AS[e]
WHERE [e].[Date] = @Date", new { Date = queryDate });
}

NotSupportedException: The member Date of type NodaTime.LocalDate cannot be used as a parameter value

Fortunately, both these problems can be solved by implementing a simple TypeHandler.

Implementing a Custom Type Handler

Out of the box, Dapper already knows how to map to the standard .NET types like Int32, Int64, string and DateTime. The problem we are running into here is that Dapper doesn’t know anything about the LocalDate type. If you want to map to a type that Dapper doesn’t know about, you can implement a custom type handler. To implement a type handler, create a class that inherits from TypeHandler<T>, where T is the type that you want to map to. In your type handler class, implement the Parse and SetValue methods. These methods will be used by Dapper when mapping to and from properties that are of type T.

Here is an example of a type handler for LocalDate.

public class LocalDateTypeHandler : TypeHandler<LocalDate>
{
public override LocalDate Parse(object value)
{
if (value is DateTime)
{
return LocalDate.FromDateTime((DateTime)value);
}

throw new DataException($"Unable to convert {value} to LocalDate");
}

public override void SetValue(IDbDataParameter parameter, LocalDate value)
{
parameter.Value = value.ToDateTimeUnspecified();
}
}

Finally, you need to tell Dapper about your new custom type handler. To do that, register the type handler somewhere in your application’s startup class by calling Dapper.SqlMapper.AddTypeHandler.

Dapper.SqlMapper.AddTypeHandler(new LocalDateTypeHandler());
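
For instance, in an ASP.NET Core app the registration might live in Startup.ConfigureServices. This is just a sketch of where the call could go; the rest of the startup class depends on your project:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    // Dapper type handlers are global, so registering once at startup is enough.
    Dapper.SqlMapper.AddTypeHandler(new LocalDateTypeHandler());
}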

There’s a NuGet for that

As it turns out, someone has already created a helpful NuGet package containing TypeHandlers for many of the NodaTime types so you probably don’t need to write these yourself. Use the Dapper.NodaTime package instead.

Wrapping it up

TypeHandlers are a simple extension point that allows Dapper to map types it doesn't support out of the box. You can write your own type handlers, but you might also want to check if someone has already published a NuGet package that handles your types.

Optimistic Concurrency Tracking with Dapper and SQL Server

Originally posted to: https://www.davepaquette.com/archive/2019/03/20/optimistic-concurrency-tracking-with-dapper-and-sql-server.aspx

This is a part of a series of blog posts on data access with Dapper. To see the full list of posts, visit the Dapper Series Index Page.

In today’s post, we explore a pattern to prevent multiple users (or processes) from accidentally overwriting each other’s changes. Given our current implementation for updating the Aircraft record, there is potential for data loss if multiple active sessions attempt to update the same Aircraft record at the same time. In the example shown below, Bob accidentally overwrites Jane’s changes without even knowing that Jane made changes to the same Aircraft record.

[Image: Concurrent updates - Bob overwrites Jane's changes]

The pattern we will use here is Optimistic Offline Lock, which is often also referred to as Optimistic Concurrency Control.

Modifying the Database and Entities

To implement this approach, we will use a rowversion column in SQL Server. Essentially, this is a column that automatically version stamps a row in a table. Any time a row is modified, the rowversion column is automatically incremented for that row. We will start by adding the column to our Aircraft table.

ALTER TABLE Aircraft ADD RowVer rowversion

Next, we add a RowVer property to the Aircraft class. The property is a byte array. When we read the RowVer column from the database, we will get an array of 8 bytes.

public class Aircraft 
{
public int Id { get; set; }
public string Manufacturer {get; set;}
public string Model {get; set;}
public string RegistrationNumber {get; set;}
public int FirstClassCapacity {get; set;}
public int RegularClassCapacity {get; set;}
public int CrewCapacity {get; set;}
public DateTime ManufactureDate {get; set; }
public int NumberOfEngines {get; set;}
public int EmptyWeight {get; set;}
public int MaxTakeoffWeight {get; set;}
public byte[] RowVer { get; set; }
}

Finally, we will modify the query used to load Aircraft entities so it returns the RowVer column. We don’t need to change any of the Dapper code here.

public async Task<Aircraft> Get(int id)
{
Aircraft aircraft;
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var query = @"
SELECT
Id
,Manufacturer
,Model
,RegistrationNumber
,FirstClassCapacity
,RegularClassCapacity
,CrewCapacity
,ManufactureDate
,NumberOfEngines
,EmptyWeight
,MaxTakeoffWeight
,RowVer
FROM Aircraft WHERE Id = @Id";
aircraft = await connection.QuerySingleAsync<Aircraft>(query, new {Id = id});
}
return aircraft;
}

Adding the Concurrency Checks

Now that we have the row version loaded into our model, we need to add the checks to ensure that one user doesn’t accidentally overwrite another user’s changes. To do this, we simply need to add the RowVer to the WHERE clause of the UPDATE statement. By adding this constraint to the WHERE clause, we ensure that the update will only be applied if the RowVer has not changed since this user originally loaded the Aircraft entity.

public async Task<IActionResult> Put(int id, [FromBody] Aircraft model)
{
if (id != model.Id)
{
return BadRequest();
}

using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var query = @"
UPDATE Aircraft
SET Manufacturer = @Manufacturer
,Model = @Model
,RegistrationNumber = @RegistrationNumber
,FirstClassCapacity = @FirstClassCapacity
,RegularClassCapacity = @RegularClassCapacity
,CrewCapacity = @CrewCapacity
,ManufactureDate = @ManufactureDate
,NumberOfEngines = @NumberOfEngines
,EmptyWeight = @EmptyWeight
,MaxTakeoffWeight = @MaxTakeoffWeight
WHERE Id = @Id
AND RowVer = @RowVer";

await connection.ExecuteAsync(query, model);
}

return Ok();
}

So, the WHERE clause stops the update from happening, but how do we know if the update was applied successfully? We need to let the user know that the update was not applied due to a concurrency conflict. To do that, we add OUTPUT inserted.RowVer to the UPDATE statement. The effect of this is that the query will return the new value for the RowVer column if the update was applied. If not, it will return null.

public async Task<IActionResult> Put(int id, [FromBody] Aircraft model)
{
byte[] rowVersion;
if (id != model.Id)
{
return BadRequest();
}

using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var query = @"
UPDATE Aircraft
SET Manufacturer = @Manufacturer
,Model = @Model
,RegistrationNumber = @RegistrationNumber
,FirstClassCapacity = @FirstClassCapacity
,RegularClassCapacity = @RegularClassCapacity
,CrewCapacity = @CrewCapacity
,ManufactureDate = @ManufactureDate
,NumberOfEngines = @NumberOfEngines
,EmptyWeight = @EmptyWeight
,MaxTakeoffWeight = @MaxTakeoffWeight
OUTPUT inserted.RowVer
WHERE Id = @Id
AND RowVer = @RowVer";
rowVersion = await connection.ExecuteScalarAsync<byte[]>(query, model);
}

if (rowVersion == null) {
throw new DBConcurrencyException("The entity you were trying to edit has changed. Reload the entity and try again.");
}
return Ok(rowVersion);
}

Instead of calling ExecuteAsync, we call ExecuteScalarAsync<byte[]>. Then we can check if the returned value is null and raise a DBConcurrencyException if it is null. If it is not null, we can return the new RowVer value.

Wrapping it up

Using SQL Server’s rowversion column type makes it easy to implement optimistic concurrency checks in a .NET app that uses Dapper.

If you are building a REST API, you should really use the ETag header to represent the current RowVer for your entity. You can read more about this pattern here.
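
As a rough sketch of that idea (not from the original post), the GET action could surface the RowVer as an ETag; the _aircraftService name is a hypothetical data access helper:

[HttpGet("{id}")]
public async Task<IActionResult> Get(int id)
{
    Aircraft aircraft = await _aircraftService.Get(id); // hypothetical helper wrapping the query shown earlier
    if (aircraft == null)
    {
        return NotFound();
    }

    // A rowversion is 8 bytes; base64 gives a convenient opaque ETag string.
    Response.Headers["ETag"] = Convert.ToBase64String(aircraft.RowVer);
    return Ok(aircraft);
}

On update, the client echoes that value back (for example in an If-Match header) and the server converts it with Convert.FromBase64String before running the UPDATE shown above.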

Managing Database Transactions in Dapper

Originally posted to: https://www.davepaquette.com/archive/2019/02/06/managing-transactions-in-dapper.aspx

This is a part of a series of blog posts on data access with Dapper. To see the full list of posts, visit the Dapper Series Index Page.

In today’s post, we explore a more complex scenario that involves executing multiple write operations. In order to ensure consistency at the database level, these operations should all succeed / fail together as a single transaction. In this example, we will be inserting a new ScheduledFlight entity along with an associated set of Flight entities.

As a quick reminder, a Flight represents a particular occurrence of a ScheduledFlight on a particular day. That is, it has a reference to the ScheduledFlight along with some properties indicating the scheduled arrival and departure times.

public class Flight 
{
public int Id {get; set;}
public int ScheduledFlightId {get; set;}
public ScheduledFlight ScheduledFlight { get; set;}
public DateTime Day {get; set;}
public DateTime ScheduledDeparture {get; set;}
public DateTime ScheduledArrival {get; set;}
}
public class ScheduledFlight 
{
public int Id {get; set;}
public string FlightNumber {get; set;}

public int DepartureAirportId {get; set;}
public Airport DepartureAirport {get; set;}
public int DepartureHour {get; set;}
public int DepartureMinute {get; set;}

public int ArrivalAirportId {get; set;}
public Airport ArrivalAirport {get; set;}
public int ArrivalHour {get; set;}
public int ArrivalMinute {get; set;}

public bool IsSundayFlight {get; set;}
public bool IsMondayFlight {get; set;}
// Some other properties
}

Inserting the ScheduledFlight

Inserting the ScheduledFlight and retrieving the database generated id is easy enough. We can use the same approach we used in the previous blog post.

// POST api/scheduledflight
[HttpPost()]
public async Task<IActionResult> Post([FromBody] ScheduledFlight model)
{
int newScheduledFlightId;
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var insertScheduledFlightSql = @"
INSERT INTO [dbo].[ScheduledFlight]
([FlightNumber]
,[DepartureAirportId]
,[DepartureHour]
,[DepartureMinute]
,[ArrivalAirportId]
,[ArrivalHour]
,[ArrivalMinute]
,[IsSundayFlight]
,[IsMondayFlight]
,[IsTuesdayFlight]
,[IsWednesdayFlight]
,[IsThursdayFlight]
,[IsFridayFlight]
,[IsSaturdayFlight])
VALUES
(@FlightNumber
,@DepartureAirportId
,@DepartureHour
,@DepartureMinute
,@ArrivalAirportId
,@ArrivalHour
,@ArrivalMinute
,@IsSundayFlight
,@IsMondayFlight
,@IsTuesdayFlight
,@IsWednesdayFlight
,@IsThursdayFlight
,@IsFridayFlight
,@IsSaturdayFlight);
SELECT CAST(SCOPE_IDENTITY() as int)";
newScheduledFlightId = await connection.ExecuteScalarAsync<int>(insertScheduledFlightSql, model);
}
return Ok(newScheduledFlightId);
}

According to the bosses at Air Paquette, whenever we create a new ScheduledFlight entity, we also want to generate the Flight entities for the next 12 months of that ScheduledFlight. We can add a method to the ScheduledFlight class to generate the flight entities.

NOTE: Let’s just ignore the obvious bugs related to timezones and to flights that take off and land on a different day.

public IEnumerable<Flight> GenerateFlights(DateTime startDate, DateTime endDate)
{
var flights = new List<Flight>();
var currentDate = startDate;

while (currentDate <= endDate)
{
if (IsOnDayOfWeek(currentDate.DayOfWeek))
{
var departureTime = new DateTime(currentDate.Year, currentDate.Month, currentDate.Day, DepartureHour, DepartureMinute, 0);
var arrivalTime = new DateTime(currentDate.Year, currentDate.Month, currentDate.Day, ArrivalHour, ArrivalMinute, 0);
var flight = new Flight
{
ScheduledFlightId = Id,
ScheduledDeparture = departureTime,
ScheduledArrival = arrivalTime,
Day = currentDate.Date
};
flights.Add(flight);
}
currentDate = currentDate.AddDays(1);
}
return flights;
}
public bool IsOnDayOfWeek(DayOfWeek dayOfWeek)
{
return (dayOfWeek == DayOfWeek.Sunday && IsSundayFlight)
|| (dayOfWeek == DayOfWeek.Monday && IsMondayFlight)
|| (dayOfWeek == DayOfWeek.Tuesday && IsTuesdayFlight)
|| (dayOfWeek == DayOfWeek.Wednesday && IsWednesdayFlight)
|| (dayOfWeek == DayOfWeek.Thursday && IsThursdayFlight)
|| (dayOfWeek == DayOfWeek.Friday && IsFridayFlight)
|| (dayOfWeek == DayOfWeek.Saturday && IsSaturdayFlight);
}

Now in the controller, we can add some logic to call the GenerateFlights method and then insert those Flight entities using Dapper.

// POST api/scheduledflight
[HttpPost()]
public async Task<IActionResult> Post([FromBody] ScheduledFlight model)
{
int newScheduledFlightId;
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var insertScheduledFlightSql = @"
INSERT INTO [dbo].[ScheduledFlight]
([FlightNumber]
,[DepartureAirportId]
,[DepartureHour]
,[DepartureMinute]
,[ArrivalAirportId]
,[ArrivalHour]
,[ArrivalMinute]
,[IsSundayFlight]
,[IsMondayFlight]
,[IsTuesdayFlight]
,[IsWednesdayFlight]
,[IsThursdayFlight]
,[IsFridayFlight]
,[IsSaturdayFlight])
VALUES
(@FlightNumber
,@DepartureAirportId
,@DepartureHour
,@DepartureMinute
,@ArrivalAirportId
,@ArrivalHour
,@ArrivalMinute
,@IsSundayFlight
,@IsMondayFlight
,@IsTuesdayFlight
,@IsWednesdayFlight
,@IsThursdayFlight
,@IsFridayFlight
,@IsSaturdayFlight);
SELECT CAST(SCOPE_IDENTITY() as int)";
newScheduledFlightId = await connection.ExecuteScalarAsync<int>(insertScheduledFlightSql, model);

model.Id = newScheduledFlightId;
var flights = model.GenerateFlights(DateTime.Now, DateTime.Now.AddMonths(12));

var insertFlightsSql = @"INSERT INTO [dbo].[Flight]
([ScheduledFlightId]
,[Day]
,[ScheduledDeparture]
,[ActualDeparture]
,[ScheduledArrival]
,[ActualArrival])
VALUES
(@ScheduledFlightId
,@Day
,@ScheduledDeparture
,@ActualDeparture
,@ScheduledArrival
,@ActualArrival)";

await connection.ExecuteAsync(insertFlightsSql, flights);

}
return Ok(newScheduledFlightId);
}

Note that we passed in an IEnumerable<Flight> as the second argument to the ExecuteAsync method. This is a handy shortcut in Dapper for executing a query multiple times. Instead of writing a loop and calling ExecuteAsync for each flight entity, we can pass in a list of flights and Dapper will execute the query once for each item in the list.
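
Conceptually, passing the list is just shorthand for a loop like this (shown only for illustration; the list form above is what we actually use):

foreach (var flight in flights)
{
    // Dapper runs the same INSERT once per Flight, reusing the open connection.
    await connection.ExecuteAsync(insertFlightsSql, flight);
}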

Explicitly managing a transaction

So far, we have code that first inserts a ScheduledFlight, then generates a set of Flight entities and finally inserts all of those Flight entities. That’s the happy path, but what happens if something goes wrong along the way? Typically, when we execute a set of related write operations (inserts, updates and deletes), we want those operations to all succeed or fail together. In the database world, we have transactions to help us with this.

The nice thing about using Dapper is that it uses standard .NET database connections and transactions. There is no need to re-invent the wheel here; we can simply use the transaction patterns that have been around in .NET for nearly two decades now.

After opening the connection, we call connection.BeginTransaction() to start a new transaction. Whenever we call ExecuteAsync (or any other Dapper extension method), we need to pass in that transaction. At the end of all that work, we call transaction.Commit(). Finally, we wrap the logic in a try / catch block. If any exception is raised, we call transaction.Rollback() to ensure that none of those write operations are committed to the database.

[HttpPost()]
public async Task<IActionResult> Post([FromBody] ScheduledFlight model)
{
int? newScheduledFlightId = null;
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var transaction = connection.BeginTransaction();

try
{
var insertScheduledFlightSql = @"
INSERT INTO [dbo].[ScheduledFlight]
([FlightNumber]
,[DepartureAirportId]
,[DepartureHour]
,[DepartureMinute]
,[ArrivalAirportId]
,[ArrivalHour]
,[ArrivalMinute]
,[IsSundayFlight]
,[IsMondayFlight]
,[IsTuesdayFlight]
,[IsWednesdayFlight]
,[IsThursdayFlight]
,[IsFridayFlight]
,[IsSaturdayFlight])
VALUES
(@FlightNumber
,@DepartureAirportId
,@DepartureHour
,@DepartureMinute
,@ArrivalAirportId
,@ArrivalHour
,@ArrivalMinute
,@IsSundayFlight
,@IsMondayFlight
,@IsTuesdayFlight
,@IsWednesdayFlight
,@IsThursdayFlight
,@IsFridayFlight
,@IsSaturdayFlight);
SELECT CAST(SCOPE_IDENTITY() as int)";
newScheduledFlightId = await connection.ExecuteScalarAsync<int>(insertScheduledFlightSql, model, transaction);

model.Id = newScheduledFlightId.Value;
var flights = model.GenerateFlights(DateTime.Now, DateTime.Now.AddMonths(12));

var insertFlightsSql = @"INSERT INTO [dbo].[Flight]
([ScheduledFlightId]
,[Day]
,[ScheduledDeparture]
,[ActualDeparture]
,[ScheduledArrival]
,[ActualArrival])
VALUES
(@ScheduledFlightId
,@Day
,@ScheduledDeparture
,@ActualDeparture
,@ScheduledArrival
,@ActualArrival)";

await connection.ExecuteAsync(insertFlightsSql, flights, transaction);
transaction.Commit();
}
catch (Exception ex)
{
//Log the exception (ex)
try
{
transaction.Rollback();
}
catch (Exception ex2)
{
// Handle any errors that may have occurred
// on the server that would cause the rollback to fail, such as
// a closed connection.
// Log the exception ex2
}
return StatusCode(500);
}
}
return Ok(newScheduledFlightId);
}

Managing database transactions in .NET is a deep but well understood topic. We covered the basic pattern above and showed how Dapper can easily participate in a transaction. To learn more, check out the official .NET documentation on transactions and connections.
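
One alternative worth knowing about is an ambient transaction via System.Transactions. Here is a minimal sketch (not from the original post) that reuses the SQL statements and variables from the example above:

using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
using (var connection = new SqlConnection(_connectionString))
{
    await connection.OpenAsync();

    // Connections opened inside the scope enlist in the ambient transaction automatically.
    await connection.ExecuteAsync(insertScheduledFlightSql, model);
    await connection.ExecuteAsync(insertFlightsSql, flights);

    // If Complete() is never called (for example, because an exception is thrown),
    // the work is rolled back when the scope is disposed.
    scope.Complete();
}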

Wrapping it up

Using transactions with Dapper is a fairly straightforward process. We just need to tell Dapper which transaction to use when executing queries. Now that we know how to use transactions, we can look at some more advanced scenarios, like adding concurrency checks to update operations to ensure users aren’t overwriting each other’s changes.

Basic Insert Update and Delete with Dapper

Originally posted to: https://www.davepaquette.com/archive/2019/02/04/basic-insert-update-delete-with-dapper.aspx

This is a part of a series of blog posts on data access with Dapper. To see the full list of posts, visit the Dapper Series Index Page.

In today’s post, we explore how easy it is to perform basic Insert, Update and Delete operations using the same Aircraft entity that we used in the first post in this series. Basically, instead of using Dapper’s QueryAsync extension method that we used to retrieve data, we will use the ExecuteAsync method.

As a quick reminder, here is the Aircraft class:

public class Aircraft 
{
public int Id { get; set; }

public string Manufacturer {get; set;}

public string Model {get; set;}

public string RegistrationNumber {get; set;}

public int FirstClassCapacity {get; set;}

public int RegularClassCapacity {get; set;}

public int CrewCapacity {get; set;}

public DateTime ManufactureDate {get; set; }

public int NumberOfEngines {get; set;}
}

NOTE: In these examples, I am ignoring some important aspects like validation. I want to focus specifically on the Dapper bits here but validation is really important. In a real-world scenario, you should be validating any data that is passed in to the server. I recommend using Fluent Validation.
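
As an illustration only (not part of the original post), a Fluent Validation validator for the Aircraft class might look something like this; the specific rules are assumptions:

using FluentValidation;

public class AircraftValidator : AbstractValidator<Aircraft>
{
    public AircraftValidator()
    {
        // Hypothetical rules; pick whatever constraints make sense for your domain.
        RuleFor(a => a.Manufacturer).NotEmpty();
        RuleFor(a => a.Model).NotEmpty();
        RuleFor(a => a.RegistrationNumber).NotEmpty();
        RuleFor(a => a.NumberOfEngines).GreaterThan(0);
    }
}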

Insert

Inserting a single new record is really easy. All we need to do is write an INSERT statement with parameters for each column that we want to set.

[HttpPost()]
public async Task<IActionResult> Post([FromBody] Aircraft model)
{
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var sqlStatement = @"
INSERT INTO Aircraft
(Manufacturer
,Model
,RegistrationNumber
,FirstClassCapacity
,RegularClassCapacity
,CrewCapacity
,ManufactureDate
,NumberOfEngines
,EmptyWeight
,MaxTakeoffWeight)
VALUES (@Manufacturer
,@Model
,@RegistrationNumber
,@FirstClassCapacity
,@RegularClassCapacity
,@CrewCapacity
,@ManufactureDate
,@NumberOfEngines
,@EmptyWeight
,@MaxTakeoffWeight)";
await connection.ExecuteAsync(sqlStatement, model);
}
return Ok();
}

The version of the ExecuteAsync method we used here accepts two parameters: a string containing the SQL statement to execute and an object containing the parameter values to bind to the statement. In this case, it is an instance of the Aircraft class which has properties with names matching the parameters defined in the INSERT statement.

Our Aircraft table’s Id column is an auto-incremented identity column. That means the primary key is generated by the database when the row is inserted. We will likely need to pass that value back to whoever called the API so they know how to retrieve the newly inserted Aircraft.

An easy way to get the generated Id is to add SELECT CAST(SCOPE_IDENTITY() as int) after the INSERT statement. The SCOPE_IDENTITY() function returns the last identity value that was generated in any table in the current session and current scope.

Now, since the SQL statement we are executing will be returning a single value (the generated id), we need to call ExecuteScalarAsync<int>. The ExecuteScalarAsync method executes a SQL statement that returns a single value whereas the ExecuteAsync method executes a SQL statement that does not return a value.

[HttpPost()]
public async Task<IActionResult> Post([FromBody] Aircraft model)
{
int newAircraftId;
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var sqlStatement = @"
INSERT INTO Aircraft
(Manufacturer
,Model
,RegistrationNumber
,FirstClassCapacity
,RegularClassCapacity
,CrewCapacity
,ManufactureDate
,NumberOfEngines
,EmptyWeight
,MaxTakeoffWeight)
VALUES (@Manufacturer
,@Model
,@RegistrationNumber
,@FirstClassCapacity
,@RegularClassCapacity
,@CrewCapacity
,@ManufactureDate
,@NumberOfEngines
,@EmptyWeight
,@MaxTakeoffWeight);

SELECT CAST(SCOPE_IDENTITY() as int)";
newAircraftId = await connection.ExecuteScalarAsync<int>(sqlStatement, model);
}
return Ok(newAircraftId);
}

Update

Updating an existing entity is similar to inserting. All we need is a SQL statement containing an UPDATE statement that sets the appropriate columns. We also want to make sure we include a WHERE clause limiting the update only to the row with the specified Id.

Again, the parameters in the SQL statement match the names of the properties in our Aircraft class. All we need to do is call the ExecuteAsync method passing in the SQL statement and the Aircraft entity.

// PUT api/aircraft/id
[HttpPut("{id}")]
public async Task<IActionResult> Put(int id, [FromBody] Aircraft model)
{
if (id != model.Id)
{
return BadRequest();
}

using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var sqlStatement = @"
UPDATE Aircraft
SET Manufacturer = @Manufacturer
,Model = @Model
,RegistrationNumber = @RegistrationNumber
,FirstClassCapacity = @FirstClassCapacity
,RegularClassCapacity = @RegularClassCapacity
,CrewCapacity = @CrewCapacity
,ManufactureDate = @ManufactureDate
,NumberOfEngines = @NumberOfEngines
,EmptyWeight = @EmptyWeight
,MaxTakeoffWeight = @MaxTakeoffWeight
WHERE Id = @Id";
await connection.ExecuteAsync(sqlStatement, model);
}
return Ok();
}

Delete

Deleting an entity is the easiest of the three operations since it only requires a single parameter: the unique Id to identify the entity being deleted. The SQL statement is a simple DELETE with a WHERE clause on the Id column. To execute the delete, call the ExecuteAsync method passing in the SQL statement and an anonymous object containing the Id to delete.

// DELETE api/aircraft/id
[HttpDelete("{id}")]
public async Task<IActionResult> Delete(int id)
{

using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var sqlStatement = "DELETE Aircraft WHERE Id = @Id";
await connection.ExecuteAsync(sqlStatement, new {Id = id});
}
return Ok();
}

I really appreciate how simple delete is using Dapper. When using Entity Framework, delete requires you to first fetch the existing entity, then delete it. That requires 2 round trips to the database while the approach we used here only requires a single round trip.
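
For comparison, a typical Entity Framework delete looks roughly like this (a sketch; the dbContext variable and DbSet name are assumptions):

// Round trip 1: load the entity
var aircraft = await dbContext.Aircraft.FindAsync(id);
if (aircraft != null)
{
    // Round trip 2: issue the DELETE
    dbContext.Aircraft.Remove(aircraft);
    await dbContext.SaveChangesAsync();
}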

Wrapping it up

Basic insert, update and delete operations are easy to implement using Dapper. Real world scenarios are often a little more complex and we will dig into some of those scenarios in future posts:

  • Bulk inserts, updates and deletes
  • Managing transactions
  • Optimistic concurrency checks

Paging Large Result Sets with Dapper and SQL Server

Originally posted to: http://www.davepaquette.com/archive/2019/01/28/paging-large-result-sets-with-dapper-and-sql-server.aspx

This is a part of a series of blog posts on data access with Dapper. To see the full list of posts, visit the Dapper Series Index Page.

In today’s post, we explore paging through large result sets. Paging is a common technique that is used when dealing with large results sets. Typically, it is not useful for an application to request millions of records at a time because there is no efficient way to deal with all those records in memory all at once. This is especially true when rendering data on a grid in a user interface. The screen can only display a limited number of records at a time so it is generally a bad use of system resources to hold everything in memory when only a small subset of those records can be displayed at any given time.

[Image: a paged data table (source: AppStack Bootstrap Template)]

Modern versions of SQL Server support the OFFSET / FETCH clause to implement query paging.

In continuing with our airline theme, consider a Flight entity. A Flight represents a particular occurrence of a ScheduledFlight on a particular day. That is, it has a reference to the ScheduledFlight along with some properties indicating the scheduled arrival and departure times.

public class Flight 
{
public int Id {get; set;}
public int ScheduledFlightId {get; set;}
public ScheduledFlight ScheduledFlight { get; set;}
public DateTime Day {get; set;}
public DateTime ScheduledDeparture {get; set;}
public DateTime ScheduledArrival {get; set;}
}
public class ScheduledFlight 
{
public int Id {get; set;}
public string FlightNumber {get; set;}

public int DepartureAirportId {get; set;}
public Airport DepartureAirport {get; set;}
public int DepartureHour {get; set;}
public int DepartureMinute {get; set;}

public int ArrivalAirportId {get; set;}
public Airport ArrivalAirport {get; set;}
public int ArrivalHour {get; set;}
public int ArrivalMinute {get; set;}

public bool IsSundayFlight {get; set;}
public bool IsMondayFlight {get; set;}
// Some other properties
}

Writing the query

As we learned in a previous post, we can load the Flight entity along with its related ScheduledFlight entity using a technique called multi-mapping.

In this case, to load all the flights to or from a particular airport, we would use the following query.

SELECT f.*, sf.*
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode

But this query could yield more results than we want to deal with at any given time. Using OFFSET/FETCH, we can ask for only a block of results at a time.

SELECT f.*, sf.*
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode
ORDER BY f.Day, sf.FlightNumber
OFFSET @Offset ROWS
FETCH NEXT @PageSize ROWS ONLY

Note that an ORDER BY clause is required when using OFFSET/FETCH.

Executing the Query

//GET api/flights
[HttpGet]
public async Task<IEnumerable<Flight>> Get(string airportCode, int page=1, int pageSize=10)
{
IEnumerable<Flight> results;

using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var query = @"
SELECT f.*, sf.*
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode
ORDER BY f.Day, sf.FlightNumber
OFFSET @Offset ROWS
FETCH NEXT @PageSize ROWS ONLY;
";

results = await connection.QueryAsync<Flight, ScheduledFlight, Flight>(query,
(f, sf) =>
{
f.ScheduledFlight = sf;
return f;
},
new { AirportCode = airportCode,
Offset = (page - 1) * pageSize,
PageSize = pageSize }
);
}

return results;
}

Here we calculate the offset based on the page and pageSize arguments that were passed in. This allows the caller of the API to request a particular number of rows and a starting point. For example, with page = 3 and pageSize = 10, the offset is 20, so rows 21 through 30 are returned.

One step further

When dealing with paged result sets, it can be useful for the caller of the API to also know the total number of records. Without the total number of records, it would be difficult to know how many records are remaining, which in turn makes it difficult to render a paging control, a progress bar or a scroll bar (depending on the use case).

A technique I like to use here is to have my API return a PagedResults<T> class that contains the list of items for the current page along with the total count.

public class PagedResults<T>
{
public IEnumerable<T> Items { get; set; }
public int TotalCount { get; set; }
}

To populate this using Dapper, we can add a second result set to the query. That second result set will simply be a count of all the records. Note that the same WHERE clause is used in both queries.

SELECT f.*, sf.*
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode
ORDER BY f.Day, sf.FlightNumber
OFFSET @Offset ROWS
FETCH NEXT @PageSize ROWS ONLY;

SELECT COUNT(*)
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode

Now, in the code that executes the query, we will use the QueryMultipleAsync method to execute both SQL statements in a single round trip.

//GET api/flights
[HttpGet]
public async Task<PagedResults<Flight>> Get(string airportCode, int page=1, int pageSize=10)
{
var results = new PagedResults<Flight>();

using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var query = @"
SELECT f.*, sf.*
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode
ORDER BY f.Day, sf.FlightNumber
OFFSET @Offset ROWS
FETCH NEXT @PageSize ROWS ONLY;

SELECT COUNT(*)
FROM Flight f
INNER JOIN ScheduledFlight sf ON f.ScheduledFlightId = sf.Id
INNER JOIN Airport a ON sf.ArrivalAirportId = a.Id
INNER JOIN Airport d ON sf.DepartureAirportId = d.Id
WHERE a.Code = @AirportCode OR d.Code = @AirportCode
";

using (var multi = await connection.QueryMultipleAsync(query,
new { AirportCode = airportCode,
Offset = (page - 1) * pageSize,
PageSize = pageSize }))
{
results.Items = multi.Read<Flight, ScheduledFlight, Flight>((f, sf) =>
{
f.ScheduledFlight = sf;
return f;
}).ToList();

results.TotalCount = multi.ReadFirst<int>();
}
}

return results;
}

Wrapping it up

Paging result sets is an important technique when dealing with large amounts of data. When using a full ORM like Entity Framework, it is implemented easily using LINQ’s Skip and Take methods. It’s so easy, in fact, that it can look a little like magic. In reality, it is very simple to write your own queries to support paged result sets and execute those queries using Dapper.
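
For reference, the Entity Framework equivalent of the paging we did above would look roughly like this (a sketch; the dbContext variable and navigation property names are assumptions, and it assumes Microsoft.EntityFrameworkCore is referenced):

var flights = await dbContext.Flights
    .Include(f => f.ScheduledFlight)
    .Where(f => f.ScheduledFlight.ArrivalAirport.Code == airportCode
             || f.ScheduledFlight.DepartureAirport.Code == airportCode)
    .OrderBy(f => f.Day)
    .ThenBy(f => f.ScheduledFlight.FlightNumber)
    .Skip((page - 1) * pageSize)   // OFFSET
    .Take(pageSize)                // FETCH NEXT
    .ToListAsync();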

Using Vue as a drop-in replacement for Knockout in an ASP.NET MVC project

Originally posted to: http://www.davepaquette.com/archive/2019/01/21/using-vue-as-a-drop-in-replacement-for-knockout-in-an-MVC-project.aspx

So you’re working on an existing (brownfield) ASP.NET MVC application. The application’s views are rendered server-side using Razor. Everything is working great and life is good. Suddenly, someone asks for a bit of additional functionality that will require some client side logic. Okay, no big deal. We do this all the time. The question, though, is which framework or JavaScript library you will use to implement that client side functionality.

The default MVC project templates already include jQuery, so you might use that. You’ll probably end up writing a lot of code if you go down that path. Chances are, you will want to use a JavaScript framework that offers two-way data binding between elements in the DOM and your model data.

It seems that for many people, Knockout.js is the default library to use in these scenarios. I won’t get into the specifics, but I think that Knockout is a little dated and that there are better options these days. If you want to dig into some of the issues with Knockout, you can read Simon Timms’ rant on the subject.

Vue.js

My fellow ASP.NET Monster James Chambers recently strongly recommended I take a look at Vue.js. I had been meaning to give Vue a try for some time now and I finally had a chance to use it on a recent project. Let me tell you…I love it.

I love it for a whole bunch of reasons. The documentation is great! It is super easy to drop into your existing project and it doesn’t get in the way. For what I needed to do, it just did the job and allowed me to get on with my day. It is also designed to be “incrementally adoptable”, which means you can start out with just the core view layer, then pull in other things like routing and state management if/when you need them.

A simple example

I won’t go into great detail about how to use Vue. If you want a full tutorial, head on over to the Vue docs. What I want to show here is just how simple it is to drop Vue into an existing ASP.NET MVC project and add a bit of client side functionality.

The simplest example I can think of is a set of cascading dropdowns. Let’s consider a form where a user is asked to enter their Country / Province. When the Country is selected, we would expect the Province dropdown to only display the valid Provinces/States for the selected Country. That probably involves a call to an HTTP endpoint that will return the list of valid values.

public class ProvinceLookupController : Controller
{
public ActionResult Index(string countryCode)
{
var provinces = ProvinceLookupService.GetProvinces(countryCode);
return Json(provinces, JsonRequestBehavior.AllowGet);
}
}

Including Vue in your Razor (cshtml) view

The easiest way to include Vue on a particular Razor view is to link to the vue.js file from a CDN. You can add the following Script tag to your scripts section.

@section scripts  {
<script src="https://cdn.jsdelivr.net/npm/vue@2.5.22/dist/vue.js"></script>
}

Be sure to check the docs to make sure you are referencing the latest version of Vue.

Binding data to our View

Now that you have included the core Vue library, you can start using Vue to bind DOM elements to model data.

Start by defining a Vue object in JavaScript. You can add this in a new <script> tag in your scripts section.

@section scripts  {
<script src="https://cdn.jsdelivr.net/npm/vue@2.5.22/dist/vue.js"></script>

<script type="text/javascript">
var app = new Vue({
el: '#vueApp',
data: {
selectedCountryCode: null,
countries: [
{ code: 'ca', name: 'Canada' },
{ code: 'us', name: 'United States' }
]
}
});
</script>
}

This Vue object targets the DOM element with id vueApp and contains some simple data: the currently selected country code and the list of countries.

Now, back in the HTML part of your cshtml, wrap the form in a div that has an id="vueApp".

<div id="vueApp">
<!-- your form -->
</div>

Next, bind the <select> element to the data in your Vue object. In Vue, data binding is done using a combination of custom attributes that start with v- and the double curly bracket (aka. Mustache) syntax for text.

<div class="form-group">
@Html.LabelFor(model => model.CountryCode, new { @class = "control-label col-md-2" })
<div class="col-md-10">
<select id="CountryCode" name="CountryCode" class="form-control"
v-model="selectedCountryCode">
<option v-for="country in countries" v-bind:value="country.code">
{{ country.name }}
</option>
</select>
</div>
</div>

Now, when you run the app, you should see a dropdown containing Canada and United States.

[Image: country dropdown]

Adding functionality

Next, you will want to add some client side logic to get the list of valid provinces from the server whenever the selected country changes.

First, add an empty provinces array and a selectedProvinceCode property to the Vue object’s data.

Next, add a method called countryChanged to the Vue object. This method will call the ProvinceLookup action method on the server, passing in the selectedCountryCode as a parameter. Assign the response data to the provinces array.

var app = new Vue({
el: '#vueApp',
data: {
selectedCountryCode: null,
countries: [
{ code: 'ca', name: 'Canada' },
{ code: 'us', name: 'United States' }
],
selectedProvinceCode: null,
provinces: []
},
methods: {
countryChanged: function () {
$.getJSON('@Url.Action("Index", "ProvinceLookup")?countryCode=' + this.selectedCountryCode, function (data) {
this.provinces = data;
}.bind(this));
}
}
});

Here I used jQuery to make the call to the server. In the Vue community, Axios is a popular library for making HTTP requests.

Back in the HTML, bind the change event from the country select element to the countryChanged method using the v-on:change attribute.

<select id="CountryCode" name="CountryCode" class="form-control" 
v-model="selectedCountryCode" v-on:change="countryChanged">
<option v-for="country in countries" v-bind:value="country.code">
{{country.name}}
</option>
</select>

Now you can add a select element for the provinces.

<div class="form-group">
@Html.LabelFor(model => model.ProvinceCode, new { @class = "control-label col-md-2" })
<div class="col-md-10">
<select id="ProvinceCode" name="ProvinceCode" class="form-control"
v-model="selectedProvinceCode">
<option v-for="province in provinces" v-bind:value="province.Code">
{{province.Name}}
</option>
</select>
</div>
</div>

Voila! You now have a working set of cascading dropdowns.

[Image: country and province dropdowns]

One last thing

You might want to disable the provinces dropdown whenever a request is being made to get the list of provinces for the selected country. You can do this by adding an isProvincesLoading property to the Vue object’s data, then setting that property in the countryChanged method.

var app = new Vue({
el: '#vueApp',
data: {
selectedCountryCode: null,
countries: [
{ code: 'ca', name: 'Canada' },
{ code: 'us', name: 'United States' }
],
selectedProvinceCode: null,
provinces: [],
isProvincesLoading: false
},
methods: {
countryChanged: function () {
this.isProvincesLoading = true;
$.getJSON('@Url.Action("Index", "ProvinceLookup")?countryCode=' + this.selectedCountryCode, function (data) {
this.provinces = data;
this.isProvincesLoading = false;
}.bind(this));
}
}
});

In your HTML, bind the disabled attribute to the isProvincesLoading property.

<div class="form-group">
<select id="ProvinceCode" name="ProvinceCode" class="form-control"
v-model="selectedProvinceCode"
v-bind:disabled="isProvincesLoading">
<option v-for="province in provinces" v-bind:value="province.Code">
{{province.Name}}
</option>
</select>
</div>

Putting it all together

Here is the entire cshtml file.

@{
ViewBag.Title = "Location Settings";
}
@model Mvc5VueJsExample.Models.LocationSettingsModel

<h2>@ViewBag.Title.</h2>
<h3>@ViewBag.Message</h3>

<div id="vueApp">
@using (Html.BeginForm("LocationSettings", "Home", FormMethod.Post, new { @class = "form" }))
{
<div class="form-group">
@Html.LabelFor(model => model.CountryCode, new { @class = "control-label col-md-2" })
<div class="col-md-10">
<select id="CountryCode" name="CountryCode" class="form-control"
v-model="selectedCountryCode" v-on:change="countryChanged">
<option v-for="country in countries" v-bind:value="country.code">
{{ country.name}}
</option>
</select>
</div>
</div>

<div class="form-group">
@Html.LabelFor(model => model.ProvinceCode, new { @class = "control-label col-md-2" })
<div class="col-md-10">
<select id="ProvinceCode" name="ProvinceCode" class="form-control"
v-model="selectedProvinceCode"
v-bind:disabled="isProvincesLoading">
<option v-for="province in provinces" v-bind:value="province.Code">
{{province.Name}}
</option>
</select>
</div>
</div>
<button class="btn btn-primary" type="submit">Save</button>
}

</div>
@section scripts {
<script src="https://cdn.jsdelivr.net/npm/vue@2.5.22/dist/vue.js"></script>

<script type="text/javascript">
var app = new Vue({
el: '#vueApp',
data: {
selectedCountryCode: null,
countries: [
{ code: 'ca', name: 'Canada' },
{ code: 'us', name: 'United States' }
],
selectedProvinceCode: null,
provinces: [],
isProvincesLoading: false
},
methods: {
countryChanged: function () {
this.isProvincesLoading = true;
$.getJSON('@Url.Action("Index", "ProvinceLookup")?countryCode=' + this.selectedCountryCode, function (data) {
this.provinces = data;
this.isProvincesLoading = false;
}.bind(this));
}
}
});
</script>
}

Wrapping it up

I hope this gives you a taste for how easy it is to work with Vue. My current thinking is that Vue should be the default choice for client side frameworks in existing ASP.NET MVC apps.

NOTE: This example uses ASP.NET MVC 5 to illustrate that Vue can be used with brownfield applications. It would be just as easy, if not easier, to use Vue in an ASP.NET Core project.

Blazor Geolocation updated

There have been significant changes over the past year or so since we published our Geolocation API package for Blazor 0.3. Today we’re pushing version 0.4 which updates the package for use with Blazor 0.7.

There were quite a few changes to the JavaScript interop APIs since we last published the package - all of them really good. However, as you’d expect with alpha software, they were breaking changes. We now do a better job of serializing payloads too.

Usage

1) In your Blazor app, add the AspNetMonsters.Blazor.Geolocation NuGet package

Install-Package AspNetMonsters.Blazor.Geolocation -IncludePrerelease

2) In your Blazor app’s Startup.cs, register the LocationService.

public void ConfigureServices(IServiceCollection services)
{
...
services.AddSingleton<LocationService>();
...
}

3) Now you can inject the LocationService into any Blazor page and use it like this:

@using AspNetMonsters.Blazor.Geolocation
@inject LocationService LocationService
<h3>You are here</h3>
<div>
Lat: @location?.Latitude <br/>
Long: @location?.Longitude <br />
Accuracy: @location?.Accuracy <br />
</div>

@functions
{
Location location;

protected override async Task OnInitAsync()
{
location = await LocationService.GetLocationAsync();
}
}

Installing an Azure Web App Site Extension with PowerShell

Originally posted to: http://www.davepaquette.com/archive/2018/11/13/installing-an-azure-web-app-site-extension-with-powershell.aspx

I recently ran into a scenario where I needed to script the installation of a site extension into an existing Azure Web App. Typically, I would use an Azure ARM deployment to accomplish this but in this particular situation that wasn’t going to work.

I wanted to install the site extension that enables Application Insights Monitoring of a live website. After digging into existing arm templates, I found the name of that extension is Microsoft.ApplicationInsights.AzureWebSites.

After searching for way too long, I eventually found the PowerShell command I needed on a forum somewhere. I can’t find it again, so I’m posting it here in hopes that it will be easier for others to find in the future.

Installing a site extension to an existing App Service Web App

New-AzureRmResource -ResourceType "Microsoft.Web/sites/siteextensions" -ResourceGroupName MyResourceGroup -Name "MyWebApp/SiteExtensionName" -ApiVersion "2018-02-01" -Force

For example, given a resource group named Test, a web app named testsite and a site extension named Microsoft.ApplicationInsights.AzureWebSites.

New-AzureRmResource -ResourceType "Microsoft.Web/sites/siteextensions" -ResourceGroupName "Test" -Name "testsite/Microsoft.ApplicationInsights.AzureWebSites" -ApiVersion "2018-02-01" -Force

Installing a site extension to a Web App Deployment Slot

The scenario I ran into was actually attempting to add this site extension to a deployment slot. When you create a deployment slot, it doesn’t copy over any existing site extensions, which is a problem because when you swap your new slot to production, your new production slot ends up losing the site extensions that were in the old production slot.

New-AzureRmResource -ResourceType "Microsoft.Web/sites/slots/siteextensions" -ResourceGroupName MyResourceGroup -Name "MyWebApp/SlotName/SiteExtensionName" -ApiVersion "2018-02-01" -Force

Using the same example as above and a slot named Staging:

New-AzureRmResource -ResourceType "Microsoft.Web/sites/slots/siteextensions" -ResourceGroupName "Test" -Name "testsite/Staging/Microsoft.ApplicationInsights.AzureWebSites" -ApiVersion "2018-02-01" -Force

Loading Related Entities with Dapper Many-to-One - Part 2

Originally posted to: https://www.davepaquette.com/archive/2018/04/10/loading-related-entities-many-to-one-part-2.aspx

This is a part of a series of blog posts on data access with Dapper. To see the full list of posts, visit the Dapper Series Index Page.

Update: April 16, 2018 Something really cool happened in the comments. The amazing Phil Bolduc very kindly pointed out that the query I wrote was not optimal and as a result, my benchmarks were not showing the best results. He didn’t stop there, he also submitted a pull request to the sample repo so I could rerun my benchmarks. Great job Phil and thanks a ton for being constructive in the comments section! I have updated the post to include Phil’s superior query.

In today’s post, we look at another option for how to load Many-to-One relationships. In the last post, we used a technique called Multi-Mapping to load related Many-to-One entities. In that post, I had a theory that maybe this approach was not the most efficient method for loading related entities because it duplicated a lot of data.

[Image: many-to-one relationship diagram]

To recap, we would like to load a list of ScheduledFlight entities. A ScheduledFlight has a departure Airport and an arrival Airport.

public class ScheduledFlight 
{
public int Id {get; set;}
public string FlightNumber {get; set;}

public Airport DepartureAirport {get; set;}
public int DepartureHour {get; set;}
public int DepartureMinute {get; set;}

public Airport ArrivalAirport {get; set;}
public int ArrivalHour {get; set;}
public int ArrivalMinute {get; set;}

//Other properties omitted for brevity
}

public class Airport
{
public int Id {get; set;}
public string Code {get; set;}
public string City {get; set;}
public string ProvinceState {get; set;}
public string Country {get; set;}
}

Using Multiple Result Sets

In the previous post, we loaded the ScheduledFlight entities and all related Airport entities in a single query. In this example, we will use two separate queries: one for the ScheduledFlight entities and one for the related arrival and departure Airport entities. These two queries will be executed as a single SQL command that returns multiple result sets.

SELECT s.Id, s.FlightNumber, s.DepartureHour, s.DepartureMinute, s.ArrivalHour, s.ArrivalMinute, s.IsSundayFlight, s.IsMondayFlight, s.IsTuesdayFlight, s.IsWednesdayFlight, s.IsThursdayFlight, s.IsFridayFlight, s.IsSaturdayFlight,
s.DepartureAirportId, s.ArrivalAirportId
FROM ScheduledFlight s
INNER JOIN Airport a1
ON s.DepartureAirportId = a1.Id
WHERE a1.Code = @FromCode

SELECT Airport.Id, Airport.Code, Airport.City, Airport.ProvinceState, Airport.Country
FROM Airport
WHERE Airport.Id = @DepartureAirportId
OR Airport.Id IN (SELECT s.ArrivalAirportId
FROM ScheduledFlight s
WHERE s.DepartureAirportId = @DepartureAirportId)

Using Dapper’s QueryMultipleAsync method, we pass in 2 arguments: the query and the parameters for the query.

public async Task<IEnumerable<ScheduledFlight>> GetAlt(string from)
{
IEnumerable<ScheduledFlight> scheduledFlights;
using (var connection = new SqlConnection(_connectionString))
{
await connection.OpenAsync();
var query = @"
SELECT s.Id, s.FlightNumber, s.DepartureHour, s.DepartureMinute, s.ArrivalHour, s.ArrivalMinute, s.IsSundayFlight, s.IsMondayFlight, s.IsTuesdayFlight, s.IsWednesdayFlight, s.IsThursdayFlight, s.IsFridayFlight, s.IsSaturdayFlight,
s.DepartureAirportId, s.ArrivalAirportId
FROM ScheduledFlight s
INNER JOIN Airport a1
ON s.DepartureAirportId = a1.Id
WHERE a1.Code = @FromCode

SELECT Airport.Id, Airport.Code, Airport.City, Airport.ProvinceState, Airport.Country
FROM Airport
WHERE Airport.Id = @DepartureAirportId
OR Airport.Id IN (SELECT s.ArrivalAirportId
FROM ScheduledFlight s
WHERE s.DepartureAirportId = @DepartureAirportId)";

using (var multi = await connection.QueryMultipleAsync(query, new{FromCode = from} ))
{
scheduledFlights = multi.Read<ScheduledFlight>();
var airports = multi.Read<Airport>().ToDictionary(a => a.Id);
foreach(var flight in scheduledFlights)
{
flight.ArrivalAirport = airports[flight.ArrivalAirportId];
flight.DepartureAirport = airports[flight.DepartureAirportId];
}

}
}
return scheduledFlights;
}

The QueryMultipleAsync method returns a GridReader. The GridReader makes it very easy to map multiple result sets to different objects using the Read<T> method. When you call the Read<T> method, it will read all the results from the next result set that was returned by the query. In our case, we call Read<ScheduledFlight> to read the first result set and map the results into a collection of ScheduledFlight entities. Next, we call Read<Airport> to read the second result set. We then call ToDictionary(a => a.Id) to load those Airport entities into a dictionary, which makes it easy to look up each Airport by Id when setting the ArrivalAirport and DepartureAirport properties for each ScheduledFlight.

Finally, we iterate through the scheduled flights and set the ArrivalAirport and DepartureAirport properties to the correct Airport entity.

The big difference between this approach and the previous approach is that we no longer have duplicate instances for Airport entities. For example, if the query returned 100 scheduled flights departing from Calgary (YYC), there would be a single instance of the Airport entity representing YYC, whereas the previous approach would have resulted in 100 separate instances of the Airport entity.

There is also less raw data returned by the query itself since the columns from the Airport table are not repeated in each row from the ScheduledFlight table.

Comparing Performance

I had a theory that the multi-mapping approach outlined in the previous blog post would be less efficient than the multiple result set approach outlined in this blog post, at least from a memory usage perspective. However, a theory is just theory until it is tested. I was curious and also wanted to make sure I wasn’t misleading anyone so I decided to test things out using Benchmark.NET. Using Benchmark.NET, I compared both methods using different sizes of data sets.
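
For a rough idea of what the harness looks like, here is a sketch (this is not the actual benchmark code from the sample repo; the ScheduledFlightData type and connection string are assumptions standing in for whatever wraps the two query methods):

using BenchmarkDotNet.Attributes;

[MemoryDiagnoser]
public class ScheduledFlightBenchmarks
{
    private const string ConnectionString = "...";  // assumed; point at the test database
    private ScheduledFlightData _data;              // assumed wrapper exposing the two query methods

    [GlobalSetup]
    public void Setup() => _data = new ScheduledFlightData(ConnectionString);

    [Benchmark]
    public Task<IEnumerable<ScheduledFlight>> MultiMapping() => _data.Get("YYC");

    [Benchmark]
    public Task<IEnumerable<ScheduledFlight>> MultipleResultSets() => _data.GetAlt("YYC");
}

// Run with: BenchmarkRunner.Run<ScheduledFlightBenchmarks>();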

I won’t get into the details of Benchmark.NET. If you want to dig into it in more detail, visit the official site and read through the docs. For the purposes of this blog post, the following legend should suffice:

Mean      : Arithmetic mean of all measurements
Error     : Half of 99.9% confidence interval
StdDev    : Standard deviation of all measurements
Gen 0     : GC Generation 0 collects per 1k Operations
Gen 1     : GC Generation 1 collects per 1k Operations
Gen 2     : GC Generation 2 collects per 1k Operations
Allocated : Allocated memory per single operation (managed only, inclusive, 1KB = 1024B)

10 ScheduledFlight records

Method               Mean       Error      StdDev     Gen 0    Allocated
MultiMapping         397.5 us   3.918 us   4.192 us   5.8594   6.77 KB
MultipleResultSets   414.2 us   6.856 us   6.077 us   4.8828   6.69 KB

As I suspected, the difference is minimal when dealing with small result sets. The results here are in microseconds, so in both cases, executing the query and mapping the results takes less than half a millisecond. The multiple result sets approach takes a little longer, which I kind of expected because of the overhead of creating a dictionary and doing lookups into that dictionary when setting the ArrivalAirport and DepartureAirport properties. The difference is minimal and in most real world scenarios, this won’t be noticeable. What is interesting is that even with this small amount of data, we can see that there is ~1 more Gen 0 garbage collection happening per 1,000 operations. I suspect we will see this creep up as the amount of data increases.

100 ScheduledFlight records

Method               Mean       Error       StdDev      Gen 0     Allocated
MultiMapping         926.5 us   21.481 us   32.804 us   25.3906   6.77 KB
MultipleResultSets   705.9 us   7.543 us    7.056 us    15.6250   6.69 KB

When mapping 100 results, the multiple result sets query is already almost 25% faster. Keep in mind though that both cases still complete in less than 1 ms, so this is very much still a micro optimization (pun intended). Either way, less than a millisecond to map 100 records is crazy fast.

1000 ScheduledFlight records

Method               Mean       Error       StdDev      Gen 0      Gen 1     Allocated
MultiMapping         5.098 ms   0.1135 ms   0.2720 ms   148.4375   70.3125   6.77 KB
MultipleResultSets   2.809 ms   0.0549 ms   0.0674 ms   109.3750   31.2500   6.69 KB

Here we go! Now the multiple result sets approach finally wins out, and you can see why. There are way more Gen 0 and Gen 1 garbage collections happening per 1,000 operations when using the multi-mapping approach. As a result, the multiple result sets approach is nearly twice as fast as the multi mapping approach.

10,000 ScheduledFlight records

Method               Mean       Error       StdDev      Gen 0       Gen 1      Gen 2      Allocated
MultiMapping         56.08 ms   1.5822 ms   1.4026 ms   1687.5000   687.5000   187.5000   6.78 KB
MultipleResultSets   24.93 ms   0.1937 ms   0.1812 ms   843.7500    312.5000   125.0000   6.69 KB

One last test with 10,000 records shows a more substantial difference. The multiple result sets approach is a full 22ms faster!

Wrapping it up

I think that in most realistic scenarios, there is no discernible difference between the two approaches to loading many-to-one related entities. If you are loading larger numbers of records into memory in a single query, then the multiple result sets approach will likely give you better performance. If you are dealing with fewer than 100 records per query, then you likely won’t notice a difference. Keep in mind also that your results will vary depending on the specific data you are loading.