jQuery Ajax is asynchronous by nature. We used to set the flag "async: false" when we needed to make an Ajax call synchronous (blocking); this option has since been deprecated. Here is a compiled list of Ajax examples using jQuery:
I am using EF for most of my ORM and data access. Recently I tried Dapper and started liking it because of its simplicity and small footprint. Here is a list of examples:
Using Dapper to fill a DataSet:
Dapper returns an IDataReader when we use the ExecuteReaderAsync method. More information on this addition can be found here and here.
Use the NuGet package to add Dapper. Add this connection field to your Main class for a quick demo (it needs the System.Data, System.Data.SqlClient, System.Configuration, and Dapper namespaces):
static IDbConnection dbConn = new SqlConnection(ConfigurationManager.ConnectionStrings["SqlServerConnString"].ConnectionString);
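The connection string name used above ("SqlServerConnString") is assumed to live in App.config; a minimal entry could look like this (server and database values are placeholders):

<configuration>
  <connectionStrings>
    <!-- Placeholder server/database values; adjust for your environment -->
    <add name="SqlServerConnString"
         connectionString="Data Source=.;Initial Catalog=MyDb;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>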
Here is the DataSet example using Dapper:
public async Task<DataSet> GetUserInformationOnUserId(int userId)
{
    var storedProcedure = "usp_getUserInformation";
    var param = new DynamicParameters();
    param.Add("@userId", userId);

    // ExecuteReaderAsync hands back an IDataReader instead of mapped objects
    var reader = await SqlMapper.ExecuteReaderAsync(dbConn, storedProcedure, param, commandType: CommandType.StoredProcedure);
    return ConvertDataReaderToDataSet(reader);
}
Here is the conversion-to-DataSet method:
public DataSet ConvertDataReaderToDataSet(IDataReader data)
{
    DataSet ds = new DataSet();
    ds.EnforceConstraints = false;
    int i = 0;

    // DataTable.Load consumes one result set per call and closes the reader
    // after the last one, which is what ends this loop.
    while (!data.IsClosed)
    {
        ds.Tables.Add("Table" + (i + 1));
        ds.Tables[i].Load(data);
        i++;
    }
    return ds;
}
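A quick sketch of using the two methods together; the user id here is just an illustrative value:

// From inside an async method; 42 is a placeholder user id
var ds = await GetUserInformationOnUserId(42);
foreach (DataTable table in ds.Tables)
{
    Console.WriteLine($"{table.TableName}: {table.Rows.Count} row(s)");
}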
Recently I had to move varbinary(max) data from one database to another using a script component.
When dealing with varbinary(max), there are two scenarios:
the length of the data is moderate
the length of the data is large
GetBytes() is intended for the second scenario (large data), where we use CommandBehavior.SequentialAccess to ensure that we are streaming the data rather than buffering it. In that usage we would typically be writing to a stream, in a loop. For example:
// moderately sized buffer; note that 8040 bytes is a SQL Server page
byte[] buffer = new byte[8040];
long offset = 0;
long read; // GetBytes returns long, not int
while ((read = reader.GetBytes(col, offset, buffer, 0, buffer.Length)) > 0)
{
    offset += read;
    destination.Write(buffer, 0, (int)read); // push downstream
}
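To put that loop in context, here is a rough sketch of opening such a reader with SequentialAccess; connectionString, documentId, the table/column names, and the output file path are all placeholder assumptions:

// Sketch only: connectionString, documentId, the table/column names and the
// file path are placeholder assumptions
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT DocumentData FROM dbo.Documents WHERE Id = @id", conn))
using (var destination = File.OpenWrite(@"C:\temp\document.bin"))
{
    cmd.Parameters.AddWithValue("@id", documentId);
    conn.Open();

    // SequentialAccess streams the column instead of buffering the whole value in memory
    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (reader.Read())
        {
            byte[] buffer = new byte[8040];
            long offset = 0;
            long read;
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                offset += read;
                destination.Write(buffer, 0, (int)read);
            }
        }
    }
}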
However, if we are dealing with moderately sized data, this one line is enough:
byte[] data = (byte[])reader[col];
Obviously, the output column data type would be "image [DT_IMAGE]" in the script component.
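For illustration, pushing those bytes into the output from the script component could look roughly like this; the output name (Output0) and column name (DocumentData) are placeholders for whatever is defined on the component:

// Inside the script component, assuming an output named Output0 with a
// DT_IMAGE column named DocumentData (both names are placeholders)
while (reader.Read())
{
    byte[] data = (byte[])reader["DocumentData"];

    Output0Buffer.AddRow();
    Output0Buffer.DocumentData.AddBlobData(data); // BlobColumn.AddBlobData takes a byte[]
}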