I found the wording of the Team Foundation Server post-install to be a tad demanding:
“The following package installation requires an immediate restart of the machine: Microsoft .NET Framework 4.5.1.”
The holiday season is upon us yet again. Typically this is a slow period, while people escape the office for the respite offered over the Christmas/New Year period.
It’s been a long year for me, and the holiday season is no different. I’m off on leave for around half of December, so I might not be posting as much as I normally do.
Therefore, have a safe and pleasant end of year, and look forward to more from Sanders Technology in the new year.
The following statement was issued by ISOC:
Internet Society Expresses Concern over Impact of Intellectual
Property Rights Provisions in Trans-Pacific Partnership Agreement
18 November 2013
The Internet Society is concerned that the global Internet may be harmed if countries adopt Intellectual Property Rights (IPR) provisions contained in the recently leaked Trans-Pacific Partnership Agreement (TPP) draft. We do not believe that these provisions are consistent with basic principles of transparency, due process, accountability, proportionality and the rule of law.
The leaked TPP Agreement is a complex set of rights and principles related to IPR and we believe that the current draft reflects a disproportionate balance of rights in favor of intellectual property owners. In addition to other issues, these provisions could also have important consequences for online privacy, a critical dimension in light of heightened awareness worldwide about the importance of protecting the privacy and security of end-users.
In particular, with respect to intermediary liability, some of the articles appear to assign new levels of responsibility to private entities and create an environment where content will be subject to extensive filtering. Some draft provisions would impose an unparalleled set of conditions on intermediaries that would allow them to escape liability and could ultimately lead to content blocking and affect legitimate speech and online expression.
Such measures are neither new nor original; they have appeared in similar forms in other national or international contexts. On the whole, these measures have proven to be inefficient or unworkable. They have failed to adequately address the stated problems or to provide sufficient answers to the existing challenges.
The Internet Society has advocated for intellectual property discussions to adhere to minimum standards of process and substance. In June 2013, we released a paper in which we called on the international community to apply standards such as transparency, due process, accountability and compliance with the rule of law to all intellectual property discussions that relate to the Internet. Similarly, we have been vocal in advancing these principles in various fora, including the World Intellectual Property Organization (WIPO), the Internet Governance Forum (IGF) and the Organization for Economic Co-Operation and Development (OECD).
We also joined other organizations [1] in a statement made in 2012, urging the negotiators of the TPP “to make [the] process more transparent and inclusive, following the multi-stakeholder model, at least for those chapters of the agreement pertaining to the Internet.”
Throughout this process, the Internet Society has taken the position of not commenting on substantive issues based on leaked texts. At the time, we understood that the leaked texts provided only a snapshot of the issues while many provisions were omitted.
The most recent leak, released by Wikileaks, appears to be the complete draft of the TPP’s Intellectual Property chapter and has made us reconsider our position.
That we feel compelled to comment on leaked versions of the TPP demonstrates that these basic process standards have been ignored. In an era where the global economy depends on information and networks, we believe that discussions that affect the Internet and its users should reflect these basic principles of transparency and openness.
Once again, the Internet Society calls upon the TPP negotiators to abide by standards of transparency as they complete this critical international agreement that will impact Internet users worldwide. We also urge the negotiating parties to reconsider the TPP’s intellectual property provisions and to ensure they don’t have a negative impact on innovation, creativity, prosperity and market participation.
[1] The Electronic Frontier Foundation (EFF), InternetNZ, Knowledge Ecology International (KEI), Open Media, Global Voices Advocacy and the International Federation of Libraries and Archives (IFLA).
If you have used or are deciding on using an Azure hosted SQL Server database, you might also want the ability to govern the identities and credentials used to access your data.
When you create a SQL Database in SQL Azure, initially credentials are created for you, but they allow you to access and govern the entire SQL Azure instance. If you’d prefer to partition access to specific databases, how do you create and manage accounts? I’ll walk you through it.
There are two ways you can manage credentials with SQL Azure: you can connect using SQL Server Management Studio (SSMS), or you can use the browser-based Windows Azure management console.
Each tool offers a slightly different value proposition – the browser-based experience is a little slower and less intuitive, but can be accessed from anywhere without requiring SSMS to be installed. SSMS gives you integration with database projects and also provides more advanced tooling, like query parsing.
No matter which tool you use, you’ll need Transact-SQL (T-SQL) to manage server logins and database user accounts. SQL Azure unfortunately does not offer a graphical user interface for the management of logins and users, so you’ll have to run T-SQL queries if you wish to manipulate credentials. Let’s take a look at the syntax you’ll need.
Logins are server wide accounts which use a simple username + password combination.
CREATE LOGIN testlogin WITH PASSWORD = '<APassword>';
To create logins, you need to connect to the SQL Azure server’s master database using the administrative account (which is created when you provision the initial SQL Azure database). There are some restrictions on the names you can use when creating logins; these are listed here: Login Name Restrictions.
Note that passwords are subject to certain complexity requirements.
Users are created within the context of a single database and are linked to logins. You must connect to the database you want to create the user in, before running the following T-SQL:
CREATE USER testuser FROM LOGIN testlogin;
A user identity alone doesn’t grant any access or permissions; you’ll need to assign some permissions manually once the user has been successfully created. We do this as we would with standard SQL Server, by assigning database roles. For example, the following grants read/write access to the named user account for the database in context:
EXEC sp_addrolemember 'db_datareader', 'testuser';
EXEC sp_addrolemember 'db_datawriter', 'testuser';
…and conversely, to revoke a permission:
EXEC sp_droprolemember 'db_datareader', 'testuser';
EXEC sp_droprolemember 'db_datawriter', 'testuser';
Removing a user or a login requires a T-SQL DROP statement. Run DROP USER from within the database that contains the user, and DROP LOGIN against the master database:
DROP USER testuser;
DROP LOGIN testlogin;
Note that if you are administering via SSMS, the GUI gives you the usual options to remove/delete users via the Object Explorer, as you would with standard SQL Server.
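Putting the pieces together, the full credential lifecycle looks like the following sketch (the login, user and password names are placeholders – the key point is which database each batch must be run against):

```sql
-- Run against the master database, connected as the administrative account
CREATE LOGIN testlogin WITH PASSWORD = '<APassword>';

-- Run against the target user database
CREATE USER testuser FROM LOGIN testlogin;
EXEC sp_addrolemember 'db_datareader', 'testuser';
EXEC sp_addrolemember 'db_datawriter', 'testuser';

-- Cleanup: DROP USER in the user database, then DROP LOGIN back in master
DROP USER testuser;
DROP LOGIN testlogin;
```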
The easiest way to access SQL Azure is from the Azure Portal. Authenticate using your Microsoft Account and then locate the SQL Database tab:
You want the Servers view (top right option); click on the server name.
From here, click on the dashboard option, and make note of points #1-#4, as illustrated above. You can launch the management console by clicking on the link located at point #3.
Once it loads (it can take a while), you need to enter the administrative login details (#2) and the correct password. If you can’t remember the password, you can reset it from the portal (#4), but bear in mind that anything using the current password will presumably break, since the login and password are used in conjunction for access via ODBC/SSMS etc.
Once you’ve authenticated successfully, click the “Select a Database” option on the top left-hand side, and pick “master” (to administer logins) or select a database to manage users/permissions.
You’ll usually get an error at this point, but that’s OK – you just want to be in the master database context (see top left).
Click on ‘New Query’ and you can then execute T-SQL to create/drop logins.
Create user (not in master database)
Fire up SQL Server Management Studio (2008 R2, 2012 or 2014) and, in the Database Engine connection window, enter the fully qualified server name (yourservername.database.windows.net) with SQL Server Authentication:
Note that you’ll need access via port 1433. Don’t forget to authorise your origin IP address too! Otherwise you’ll get this sort of error message:
You can grant access by IP address via the Azure portal: go to the server (as above) but click on the “Configure” option, where you can add your current IP address easily enough. Don’t forget to click Save before leaving!
Once you have successfully authenticated/connected, you can manage both the SQL Azure master database and any custom databases fairly easily. Note that you still need to execute T-SQL, but the SSMS option can generate the T-SQL for you:
You can also walk the Object Explorer to get to the Security branch, and manipulate the objects contained within. Reminder: there’s no GUI support as there is with standard SQL Server.
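If you’re scripting access rather than connecting through SSMS, the same details apply. Here’s a minimal sketch (in Python, with hypothetical server/database/credential names) of how a SQL Azure ODBC connection string of this era is assembled – note the user@server form of the login, TCP port 1433 and the encrypted transport:

```python
# Build an ODBC connection string for a SQL Azure database.
# The server, database, login and password values are placeholders.
def build_azure_connection_string(server, database, login, password):
    # SQL Azure expects logins in the form user@servername,
    # connections over TCP port 1433, and encryption enabled.
    return (
        "Driver={SQL Server Native Client 11.0};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"Uid={login}@{server};"
        f"Pwd={password};"
        "Encrypt=yes;"
    )

conn_str = build_azure_connection_string(
    "myserver", "mydatabase", "testlogin", "<APassword>"
)
print(conn_str)
```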
Hopefully this is all the info you need to successfully manage logins and users in SQL Azure.
Please bear in mind that Azure can change rapidly – these screenshots may not be valid at any point in the future. This article was published in November 2013, and the instructions were valid at that time.
Adding Users to Your SQL Azure Database – http://blogs.msdn.com/b/sqlazure/archive/2010/06/21/10028038.aspx
Managing Databases and Logins in Windows Azure SQL Database – http://msdn.microsoft.com/en-us/library/ee336235.aspx
Last week I gave an internal presentation to my fellow consultants at CGI on the principles of data modelling/data architecture, modelling within Visual Studio 2013 and a history of the (ADO.NET) Entity Framework.
I’ve attached the slide deck to this article, and also in my presentations page.
Once we got past the initial introductions, I dove into some of the fundamental principles of data access design. These are the key design considerations which every mature solution should take into account – particularly when building a design which marries to the ACID principles.
This part of the presentation wasn’t directly related to either data modelling in Visual Studio or implementing the Entity Framework ORM, but I don’t think it is a bad idea to restate some core data access design goals where possible.
Once we were past the concepts, we went straight into…
To be honest with you, I forced myself to use Visual Studio’s out-of-the-box database project as a design tool instead of jumping into SQL Management Studio as I normally would. Partly, this was to give the tools some fair use – the designer support is still a bit sluggish – but there are still some niceties to be had here.
The latest incarnation has some decent and attractive features: the schema compare functionality is simply superb for harmonizing T-SQL based on instances or other code repositories, and the T-SQL import wizard helps with getting projects up and running quickly.
Possibly the best feature is the publishing wizard, which you can use to easily deploy to SQL Azure or to instances; or to run as part of an automated build.
Besides showing how the entity model is generated from the database schema, I wanted to impress upon the audience the costs vs. benefits of adopting an ORM solution – particularly focused on the quick wins against the limitations and potential performance problems.
Ultimately this led into a review of a generic interface pattern which I’ve been working on for the past few weeks, and some of the power of consolidating common data access methods (e.g. Create, Read, Update and Delete) into a common implementation using generics.
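The pattern itself isn’t reproduced in the deck, but the core idea – one generic CRUD implementation shared by every entity type – can be sketched language-agnostically. Here’s a minimal illustration (in Python rather than C#, with an in-memory store standing in for the Entity Framework context; all names here are mine, not from the presentation):

```python
from typing import Dict, Generic, TypeVar

T = TypeVar("T")

class Repository(Generic[T]):
    """A single Create/Read/Update/Delete implementation shared by all entity types."""

    def __init__(self):
        self._store: Dict[int, T] = {}
        self._next_id = 1

    def create(self, entity: T) -> int:
        key = self._next_id
        self._store[key] = entity
        self._next_id += 1
        return key

    def read(self, key: int) -> T:
        return self._store[key]

    def update(self, key: int, entity: T) -> None:
        if key not in self._store:
            raise KeyError(key)
        self._store[key] = entity

    def delete(self, key: int) -> None:
        del self._store[key]

# The same implementation serves any entity type without duplication:
customers: Repository[str] = Repository()
cid = customers.create("Jane Doe")
```

The payoff is the same as with the generics-based C# version: each new entity type gets a full data access surface for free, with the persistence details isolated in one place.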
At the end, I was planning to surprise the audience by “live switching” from accessing a local SQL instance to querying data from SQL Azure by simply changing a connection string, but due to having to move rooms at the last minute, the 4G connection I was using hadn’t been authorised on the SQL Azure Database, so the surprise failed.
The awesome takeaway (blown surprise aside) was that using the Entity Framework, there was no need to do any recompilation – the model worked seamlessly with local and Azure-based data stores. I guess I’ll have to save that surprise for another audience at another time.
To be honest, I should have split this into two presentations. There’s so much to discuss when it comes to decent data design principles that those, together with data modelling, could have filled a session on their own. The Entity Framework represents a large body of work in its own right; I could speak for hours about how it can be adapted and extended.
We didn’t even scratch the surface, so this may lead to a follow-up presentation. Here’s the slide deck from the day.