Q In the October 2003 issue of MSDN® Magazine, you suggested programmatically executing a Data Transformation Services (DTS) package. You failed to mention the "two lines of code" required to execute the command. Can you help?
A The two lines required to programmatically execute a DTS package really represent two steps. The first step is to load the package (from any of the three places a package can be saved: SQL Server®, Meta Data Services, or a file). The next step is to execute it. A complete program actually requires a few additional lines, of course, to declare variables and so forth.
Complete documentation for the DTS programmability object model can be found in SQL Server Books Online.
After setting a COM reference to the Microsoft DTSPackage Object Library and adding a using statement at the top of the C# module, the code would look like Figure 1.
There are a number of Visual Basic® and Visual Basic .NET articles on doing DTS programmatically, but not many on C#. A good article for C# that shows code like this, and also shows how to take things to the next step, is Knowledge Base article 319985 "Handle Data Transformation Services Package Events in Visual C# .NET".
A good troubleshooting article on problems that occur when programmatically executing DTS packages is Knowledge Base article 323685 "Troubleshoot DTS Packages That You Run from Visual Basic Applications".
Q I have an ASP.NET application talking to a Web Service. I am getting this intermittent error: "The underlying connection was closed: An unexpected error occurred on a send." The architecture includes a third-party hardware load balancer which routes the calls to the Web Service.
After some troubleshooting, I found that the application works fine if the load balancer is not used. It turns out that the load balancer redirects clients to an available server if there isn't any activity from this client for one minute. How can I fix this problem?
A If the ASP.NET Web app and the Web Service are both behind the load balancer, you should know that it is not a good idea to load balance the ASPX-to-ASMX call. The client has already been load balanced to a particular node when it hits the ASPX page; going back through the load balancer to be routed to another machine for the ASMX call is bad for performance. Furthermore, invoking a Web Service on the local machine also hurts performance. As a workaround, simply add a standard reference to the assembly and invoke the code directly.
Q I built a Web Service and added a public property to the class. When I generate the proxy class in my consumer ASP.NET app, this property doesn't come along with it. Do I have to add an attribute to the property the same way as you add [WebMethod] to Web Service methods?
A It doesn't make much sense to expose public properties on a stateless, distributed service. The practice has been discouraged since Microsoft® Transaction Server 1.0, although the platform still supported it to an extent. That said, there are some situations in which you may want "properties" in a Web Service, such as when you're passing context or data that's consistent across all Web methods. You can mimic properties in a Web Service by using a SOAP header, as shown in Figure 2. The Web client would look like this:
ws3.BillingService3 service = new ws3.BillingService3();
ws3.Properties props = new ws3.Properties();
props.FirstName = "John";
props.LastName = "Doe";
service.PropertiesValue = props;
service.UpdateOrder("hello", "world", 1, true);
Q I am authoring a Web page in FrontPage® 2000 that connects to a Microsoft Access database that resides on another machine. The IUSR and IWAM accounts are local on both machines, with the same name and password. This machine is running Windows NT® 4.0 SP6a with IIS; the other machine, also running IIS, has FrontPage 2000 installed on it.
For some reason this does not work and I get the following error: "The Microsoft Jet database engine cannot open the file '(unknown)'. It is already opened exclusively by another user, or you need permission to view its data."
By switching the roles of the two machines (making the IIS machine the file server and the file server the IIS machine), everything works fine. What's wrong here?
A Did you give the users (IUSR/IWAM) permission to create files in the folder where the Access database lives? When an Access database is opened, Jet creates a lock file alongside it to record who is accessing the database. If that file cannot be created, you will get this error; depending on where the database lives, you may even have to relax permissions on the current user profile's temp directory. Note that if you open the file as read-only/exclusive, then no lock file needs to be created. In addition, you can get this error with Access if memory is fragmented on the server running IIS. When you open an Access database via Jet, Jet attempts to allocate 64MB of virtual memory in one contiguous block; if this fails, you get the "(unknown)" error as well.
Q Is there a limit to the size or number of JScript® files in Microsoft Internet Explorer? Here's why I ask: I have many JScript files included in an HTML page. In Internet Explorer 5.5 SP2 with a slow network connection, I sometimes get "object expected" errors (referring to a function in a .js file). I confirmed that the file is in the cache, but when it loads in the debugger, it is empty.
I considered a workaround to add the following exception handling code, but I don't know if this is a valid approach:
try
{
    // call a test function defined in test.js
}
catch (e)
{
    // insert <SCRIPT language=JavaScript src="test.js"></SCRIPT> again
}
A This isn't a good workaround: if Internet Explorer did in fact reload the script this way, you could wind up with an endless loop of scripts reloading themselves, allocating resources without bound. In addition, the Document Object Model isn't guaranteed to be ready for use before the body onload event has fired.
Since Internet Explorer executes scripts sequentially, you can detect whether your script has finished loading by setting a sentinel variable at the end of it:
<script language="JavaScript">
function myfunction() {}
// ...
// script is finished *loading*
myScriptHasLoaded = true;
</script>
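On the consuming page, code can then test the sentinel before calling into the late-loading script. Here is a minimal sketch of that pattern; `safeCall` is an illustrative helper name, not part of any Internet Explorer API, and `myfunction` stands in for a function from the included file:

```javascript
// Sentinel set by the last statement of the included script;
// it stays undefined until that script has finished loading.
var myScriptHasLoaded;

// Stand-in for a function defined in the included script.
function myfunction() { return "ran"; }

// Invokes fn only when the sentinel confirms the script loaded;
// otherwise returns a fallback instead of raising "object expected".
function safeCall(fn, fallback) {
    if (typeof myScriptHasLoaded != "undefined" && myScriptHasLoaded) {
        return fn();
    }
    return fallback;
}

// Before the script loads, the guarded call falls back...
safeCall(myfunction, "not loaded yet");   // "not loaded yet"

// ...and once the sentinel is set, it goes through.
myScriptHasLoaded = true;
safeCall(myfunction, "not loaded yet");   // "ran"
```

The `typeof` test matters: referencing an undeclared variable directly would itself throw, while `typeof` safely yields "undefined" whether or not the included script has run.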
To answer your first question, there is one known HTML Application (HTA) out there with 100,000 lines of JScript code. Of course this is a very bad idea, but it does work, and its authors haven't hit a limit in Internet Explorer yet.
Q I have a single domain and need to enforce certain password complexity criteria for only one group of users. What are the best practices around this?
A You need to implement a password filter, a standard DLL invoked by the security provider each time a user changes a password. The filter can accept or refuse passwords based on custom policy.
Now you are probably wondering how a password filter can facilitate multiple password policies within a single domain. Policy is stored in the account database itself, so there can be only a single policy for each domain, or a single policy for the local account database. This may change in the future, but right now that's how it works.
However, it sounds like you just need different complexity requirements for your passwords (such as minimum length, maximum length, strong password, and so on). This can be accomplished with a password filter, which will let you apply your custom policy (even more complex or flexible than the one provided by Windows) to your specific groups.
When a password filter is invoked, it is given the name of the user who wants to change the password, along with the new password. The filter can therefore check which groups the user belongs to and, according to the settings with which your password filter has been configured, decide which policy to apply and whether to accept or refuse the new password.
Just remember that there can be only a single password policy for each account database, which means a single policy per domain. Individual computers within the domain can have their own password policies, but those affect only local accounts on those computers.
Q I have a question regarding the use of DataAdapter classes in the Microsoft .NET Framework. One of my engineers has suggested that, based on his research, my team should be using data adapters in our Web Services instead of having stored procedures return data into our datasets. Others have looked at this issue and come to the conclusion that data adapters are primarily intended for traditional 32-bit client/server applications, not Web-based applications. Which approach is better?
A Basically, you are asking if data adapters are best used for Visual Basic .NET rich client apps where you're maintaining your database connection rather than in a Web Service environment where you create the connection, use it, and then drop it.
One way to look at it is that since ADO.NET favors a disconnected model, there is no reason not to use data adapters in Web-based apps. Additionally, the adapter can leverage existing stored procedures, manage the connection, and eliminate a lot of ADO.NET code that would otherwise have to be written and maintained. You also seem somewhat concerned about concurrency issues with data adapters.
Here is some real-world experience to help answer your concerns about the role of data adapters in a large scale, enterprise Web Services application considering performance, concurrency, and code maintenance. The .NET Data Access Architecture Guide on the MSDN Patterns and Practices Developer Center offers good information on performance. MSDN also has an article which contains some actual measurements from simulated scenarios (see Performance Comparison: Data Access Techniques).
A data reader can offer better scalability and performance if used appropriately. The longer you hold a connection, the more scalability suffers. Whether it makes a difference in a specific situation depends on a lot of factors.
Stored procedures are recommended over dynamic SQL unless you want to turn your database server into a database query compilation engine. Again, the .NET data access architecture guide discusses this.
An additional factor is how the resulting data is to be transported from one tier to the other. If you need to remote your data tier and transfer the resulting data to a client, then a data reader is not the way to go. However, be aware of the serious performance issues you will encounter if the resulting dataset is large. If you use data adapters, then you should make sure the connections are closed when the request is completed.
The real issue here is that the resulting dataset performance is more a function of dataset complexity than of size. While more data obviously increases the payload, the complexity of datasets can be even more of an issue.
Further, consider your methods of transferring data between client and server. When you have highly complicated data structures (such as complex datasets), the Web Service means of serialization offers much better performance than remoting serialization.
In response to the initial question, your team may be confusing the concepts. Datasets always use data adapters when they are filled from a data source, and data adapters do not create any kind of connected state to the data. The real factors in choosing between data readers and data adapters are scalability and performance, as noted.
Q What is the best way to store an XML file (as BLOB data) in SQL Server? Should I transform and traverse the XML and, after parsing it, store it in a meaningful format in database tables? Or should I store the raw XML BLOB in a text field in a table, or use something else completely?
A The answer depends on the structure of the data and the types of queries. If the XML mainly represents relational data and you want to query it and return parts of it, then shred it into relational tables using annotated schemas, XML Bulk Load, or OpenXML. If, on the other hand, you always want the same XML document back in full without complex queries (the XML is markup-centric and does not map easily onto relations), then use a TEXT/NTEXT field.
Q Is there any support for an XML data type in SQL Server 2000? Specifically, I want to push a small collection of elements into a field as XML and then issue queries that test for the presence of some elements. I would like to be able to do an XPath query on each row that contained an XML field.
A While there is no XML datatype in SQL Server 2000, you can store XML in nvarchar columns and use OpenXML to test for the presence or absence of paths, if you can live with the following limitations. First, each XML document must fit into an nvarchar/varchar column (a maximum length of 4,000 Unicode characters or 8,000 single-byte characters). Second, you have to use a cursor inside a stored procedure to iterate over every row so that you can use sp_xml_preparedocument to parse the XML (and sp_xml_removedocument to free the memory). This could unnecessarily shift complexity from managed code into procedural T-SQL, a trade-off that you have to decide upon.
Of course, you can also do the parsing and XPath execution with the System.Xml classes in managed code on the middle tier. OpenXML is only required if you need to process the data inside SQL Server. On the middle tier you have more flexibility, but the cost is that you have to transfer all your XML data to the middle tier for processing. Also note that the upcoming version of SQL Server, code-named "Yukon," will have native XML support. See Introducing SQL Server 2005 for more information.
Got a question? Send questions and comments to webqa@microsoft.com.
Thanks to the following Microsoft developers for their technical expertise: Earl Beaman, Steve Beaugé, Efi Bregman, Guillaume Cadet, Mark Cleary, Denny Dayton, Kurt Dillard, Naphtaly Friedman, Miguel Isidoro, Anders Janson, Michael Kaplan, Bhalinder Kehal, Andreas Klein, Rogvi Knudsen, Douglas Laudenschlager, Chun Liang, Dan Liebling, Rick Lievano, Maurizio Macagno, Bill Maurer, Matt Neerincx, Mauro Ottaviani, Malini Puttaiah, David Qiu, Pavel Rozalski, Michael Rys, K. Saravana, Chad Sheffield, Louis Weinstein