Search Results

Search found 144001 results on 5761 pages for 'sql server data tools'.


  • PowerShell & SQL Compare

    - by Grant Fritchey
    Just a quick blog post to share a couple of scripts for using PowerShell to call SQL Compare. This is an example from my session at SQL in the City on setting up a sandbox development process. It just runs a compare between a set of scripts and a database and deploys the changes:

    set-Location "c:\Program Files (x86)\Red Gate\SQL Compare 10\";
    ./sqlcompare /s2:DOJO /db2:MovieManagement_Sandbox /sourcecontrol1 /vu1:grant /vp1:12345 /r1:HEAD /sfx:scripts.xml /sync /mfx:migrations.xml /verbose;

    I would not recommend using the /verbose output for real automation, but I'm showing off how the tool works. This particular script does a compare straight from source control to a database on my server. You can use variables where I've hard-coded values. That's it. Works great. Just wanted to share it out there. I have others that I'll track down and put up here.
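
    As the post suggests, variables can stand in for the hard-coded values. A minimal parameterized sketch of the same call (the server, database and credential values are placeholders to adjust for your environment):

    $compareHome  = 'C:\Program Files (x86)\Red Gate\SQL Compare 10'
    $targetServer = 'DOJO'                      # placeholder server name
    $targetDb     = 'MovieManagement_Sandbox'   # placeholder database name

    Set-Location $compareHome
    ./sqlcompare /s2:$targetServer /db2:$targetDb /sourcecontrol1 `
        /vu1:grant /vp1:12345 /r1:HEAD `
        /sfx:scripts.xml /sync /mfx:migrations.xml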

    Read the article

  • Checking who is connected to your server with PowerShell

    - by Fatherjack
    There are many occasions when, as a DBA, you want to see who is connected to your SQL Server, along with how they are connecting and what sort of activities they are carrying out. I'm going to look at a couple of ways of getting this information and compare the effort required and the results achieved by each. SQL Server comes with a couple of stored procedures to help with this sort of task – sp_who and its undocumented counterpart sp_who2. There is also the pumped-up version of these called sp_whoisactive, written by Adam Machanic, which does way more than these procedures. I wholly recommend you try it out if you don't already know how it works; when it comes to serious interrogation of your SQL Server activity it is absolutely indispensable. Anyway, back to the point of this blog: we are going to look at getting the information from sp_who2 for a remote server. I wrote this PowerShell script a week or so ago and was quietly happy with it for a while. I'm relatively new to PowerShell, so forgive both my rather low threshold for entertainment and the fact that something so simple is a moderate achievement for me.

    $Server = 'SERVERNAME'
    $SMOServer = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
    # connection and query stuff
    $ConnectionStr = "Server=$Server;Database=Master;Integrated Security=True"
    $Query = "EXEC sp_who2"
    $Connection = New-Object System.Data.SqlClient.SqlConnection
    $Table = New-Object System.Data.DataTable
    $Connection.ConnectionString = $ConnectionStr
    try
    {
        $Connection.Open()
        $Command = $Connection.CreateCommand()
        $Command.CommandText = $Query
        $result = $Command.ExecuteReader()
        $Table.Load($result)
    }
    catch
    {
        # Show error
        $error[0] | Format-List -Force
    }
    $Title = "Data access processes (" + $Table.Rows.Count + ")"
    $Table | Out-GridView -Title $Title
    $Connection.Close()

    So this is pretty straightforward: create an SMO object that represents our chosen server, define a connection to the database and a table object for the results, execute our query over the connection, load the results into our table object and then, if everything is error free, display these results in the PowerShell grid viewer. The query simply gets the results of 'EXEC sp_who2' for us. How many connections there are will influence how long the query runs. The grid viewer lets me sort and search the results, so it can be a pretty handy way to locate troublesome connections. Like I say, I was quite pleased with this; it seems a pretty simple script and was working well for me, and I have added a few parameters to control the output and give me more specific details. But then I saw a script that uses the $SMOServer object itself to provide the process information, saving having to define the connection object and query specifications.

    $Server = 'SERVERNAME'
    $SMOServer = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
    $Processes = $SMOServer.EnumProcesses()
    $Title = "SMO processes (" + $Processes.Rows.Count + ")"
    $Processes | Out-GridView -Title $Title

    Create the SMO object of our server and then call the EnumProcesses method to get all the process information from the server. Staggeringly simple! The results are a little different, though. Some columns are the same and we can see the same basic information, so my first thought was to check which runs faster – so that I can get my results more quickly and also place less stress on my server(s). PowerShell comes with a great way of testing this – the Measure-Command function. All you have to do is wrap your piece of code in Measure-Command {[your code here]} and it will spit out the time taken to execute the code. So, I placed both of the above methods of getting SQL Server process connections in two Measure-Command wrappers and pressed F5! The PowerShell console goes blank for a while as the code is executed internally when Measure-Command is used, but then the grid viewer windows appear and the console shows the timings. You can take the output from Measure-Command and format it for easier reading, but in a simple comparison like this we can simply cross-refer the TotalMilliseconds values from the two result sets to see how the two methods performed. The query execution method (running EXEC sp_who2) is the first set of timings and the SMO EnumProcesses is the second. I have run these on a variety of servers and, while the results vary from execution to execution, I have never seen the SMO version slower than the other. The difference has varied, and the time for both has ranged from sub-second to almost 5 seconds on some systems. This difference, I would suggest, is partly due to the cost overhead of having to construct the data connection and so on, whereas the SMO EnumProcesses method has the connection to the server already in place and just needs to call back the process information. There is also the difference in the data sets to consider. Let's take a look at what we get and where the two methods differ (sp_who2 column / SMO EnumProcesses column : description):

    -           / Urn                : what looks like an XML or JSON representation of the server name and the process ID
    SPID        / Spid               : the process ID
    Status      / Status             : the status of the process
    Login       / Login              : the login name of the user executing the command
    HostName    / Host               : the name of the computer where the process originated
    BlkBy       / BlockingSpid       : the SPID of a process that is blocking this one
    DBName      / Database           : the database that this process is connected to
    Command     / Command            : the type of command that is executing
    CPUTime     / Cpu                : the CPU activity related to this process
    DiskIO      / -                  : the disk IO activity related to this process
    LastBatch   / -                  : the time the last batch was executed from this process
    ProgramName / Program            : the application that is facilitating the process connection to the SQL Server
    SPID1       / -                  : in my experience this is always the same value as SPID
    REQUESTID   / -                  : in my experience this is always 0
    -           / Name               : in my experience this is always the same value as SPID, and so could be seen as analogous to SPID1 from sp_who2
    -           / MemUsage           : an indication of the memory used by this process, but I don't know what it is measured in (bytes, Kb, Mb…)
    -           / IsSystem           : True or False depending on whether the process is internal to the SQL Server instance or has been created by an external connection requesting data
    -           / ExecutionContextID : in my experience this is always 0, so it could be analogous to REQUESTID from sp_who2

    Please note, these are my own very brief descriptions of these columns; detail can be found on MSDN for the columns in the sp_who results here: http://msdn.microsoft.com/en-GB/library/ms174313.aspx. Where the columns are common I have used that description; in other cases the information returned is purely for interpretation by the reader. Rather annoyingly, both result sets have useful information that the other doesn't. sp_who2 returns DiskIO and LastBatch information, which is really useful, but the SMO processes method gives you IsSystem and MemUsage, which have their place in fault diagnosis methods too. So which is better? On reflection I think I prefer to use the sp_who2 method primarily, but knowing that the SMO EnumProcesses method is there when I need it is really useful and I'm sure I'll use it regularly. I'm OK with the fact that it is the slower method, because Measure-Command has shown me how close it is to the other option and that it really isn't a large enough margin to matter.
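
    For reference, the timing comparison described above can be reproduced with something like this (a sketch; SERVERNAME is a placeholder, and the two methods are unchanged from the scripts above):

    $Server = 'SERVERNAME'
    $SMOServer = New-Object Microsoft.SQLServer.Management.SMO.Server $Server

    # Time method 1: running EXEC sp_who2 over a SqlClient connection
    $t1 = Measure-Command {
        $Connection = New-Object System.Data.SqlClient.SqlConnection("Server=$Server;Database=Master;Integrated Security=True")
        $Table = New-Object System.Data.DataTable
        $Connection.Open()
        $Command = $Connection.CreateCommand()
        $Command.CommandText = "EXEC sp_who2"
        $Table.Load($Command.ExecuteReader())
        $Connection.Close()
    }

    # Time method 2: SMO's EnumProcesses
    $t2 = Measure-Command { $Processes = $SMOServer.EnumProcesses() }

    "sp_who2:       {0:N0} ms" -f $t1.TotalMilliseconds
    "EnumProcesses: {0:N0} ms" -f $t2.TotalMilliseconds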

    Read the article

  • Data Generator Source Adapter

    This component needs little explanation. It generates random integer (DT_I4) and string (DT_WSTR) data and places them in the pipeline. You specify how many columns of each you would like, and for any string columns you pass a fixed length value. You then specify how many rows in total you require to be generated. We use this component to test the pipeline and downstream components. Previously we would have used a script component (as a source) to generate the rows, but found ourselves rewriting the code too often, so we created this component.

    Screenshots: SQL Server 2005 Integration Services; SQL Server 2008/2012 Integration Services.

    The component is provided as an MSI file; however, to complete the installation you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Data Generator Source in the list.

    Downloads
    The Data Generator Source Adapter is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.
    Data Generator Source Adapter for SQL Server 2005
    Data Generator Source Adapter for SQL Server 2008
    Data Generator Source Adapter for SQL Server 2012

    Version History
    SQL Server 2012
    Version 3.0.0.30 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
    SQL Server 2008
    Version 2.0.0.29 - SQL Server 2008 February 2008 CTP. Includes support for upgrade of 2005 packages. Simplified user interface. (4 Mar 2008)
    Version 2.0.0.27 - SQL Server 2008 November 2007 CTP. String columns will now use the default system code page. Previously string columns always used 1252. (15 Feb 2008)
    SQL Server 2005
    Version 1.1.0.23 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. (12 Jun 2006)
    Version 1.0.0.0 - SQL Server 2005 IDW 16 Sept CTP. Public release. (6 Oct 2005)

    Read the article

  • The Best Articles for Backing Up and Syncing Your Data

    - by Lori Kaufman
    World Backup Day is March 31st, and we decided to provide you with some useful information to make backing up your data easier. We've published articles about backing up various types of data and settings, both offline and online. There are all kinds of settings on your computer to back up in addition to your personal data, such as Wi-Fi passwords, drivers, and settings for programs like web browsers, Office, and Windows Live Writer. There are also many tools available to help you keep your data and settings backed up.

    Read the article

  • How to set up Tomcat 7 as a service on Windows Server 2008

    - by birdy
    I set up tomcat7 as a service as below:

    c:\Program Files (x86)\Tomcat7\apache-tomcat-7.0.32\bin>service.bat install
    Installing the service 'Tomcat7' ...
    Using CATALINA_HOME: "C:\Program Files (x86)\Tomcat7\apache-tomcat-7.0.32"
    Using CATALINA_BASE: "C:\Program Files (x86)\Tomcat7\apache-tomcat-7.0.32"
    Using JAVA_HOME:     "C:\Program Files (x86)\Java\jdk1.7.0_09"
    Using JRE_HOME:      "C:\Program Files (x86)\Java\jdk1.7.0_09\jre"
    Using JVM:           "C:\Program Files (x86)\Java\jdk1.7.0_09\jre\bin\server\jvm.dll"

    However, when I try to start the service, I get the error below:

    c:\Program Files (x86)\Tomcat7\apache-tomcat-7.0.32\bin>tomcat7.exe
    %1 is not a valid Win32 application.
    Failed to run service as console application

    This is the file I downloaded from Apache: apache-tomcat-7.0.32-windows-x64.zip. I am able to successfully start Tomcat on port 8080 as a standalone thing, meaning I go to a command prompt, type startup.bat and it starts up successfully. Question: How can I resolve this, and what are the things I should be troubleshooting?
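
    One thing worth checking, given the paths above: the service wrapper came from the x64 zip, while the JDK lives under Program Files (x86). A quick bitness check, using the path from the question (the -version banner reports whether the JVM is 32-bit or 64-bit):

    # Print the JVM's bitness; a 32-bit JVM cannot host a 64-bit service wrapper
    & 'C:\Program Files (x86)\Java\jdk1.7.0_09\bin\java.exe' -version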

    Read the article

  • Run a .sql script file in C#

    - by SAMIR BHOGAYTA
    using System.Data.SqlClient;
    using System.IO;
    using Microsoft.SqlServer.Management.Common;
    using Microsoft.SqlServer.Management.Smo;

    namespace ConsoleApplication1
    {
        class Program
        {
            static void Main(string[] args)
            {
                string sqlConnectionString = "Data Source=(local);Initial Catalog=AdventureWorks;Integrated Security=True";

                // Read the whole script file into a single string
                string script = File.ReadAllText("C:\\myscript.sql");

                // SMO's ExecuteNonQuery copes with GO batch separators,
                // which a plain SqlCommand would reject
                using (SqlConnection conn = new SqlConnection(sqlConnectionString))
                {
                    Server server = new Server(new ServerConnection(conn));
                    server.ConnectionContext.ExecuteNonQuery(script);
                }
            }
        }
    }
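
    For comparison, a rough PowerShell sketch of the same SMO call (this assumes the SMO assemblies that ship with SQL Server are installed; the connection string and file path are the same placeholders as above):

    # Load SMO and run a .sql script, letting SMO handle GO batch separators
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ConnectionInfo') | Out-Null

    $conn = New-Object System.Data.SqlClient.SqlConnection('Data Source=(local);Initial Catalog=AdventureWorks;Integrated Security=True')
    $serverConn = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($conn)
    $server = New-Object Microsoft.SqlServer.Management.Smo.Server($serverConn)
    $script = [System.IO.File]::ReadAllText('C:\myscript.sql')
    $server.ConnectionContext.ExecuteNonQuery($script)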

    Read the article

  • Update: TFS Power Tools March 2011

    - by Enrique Lima
    There is an update available for the TFS Power Tools and the TFS Build Power Tools. Among the updates to the tools:

    - Changes to the Team Foundation Server Backups Add-In for the TFS Admin Console
    - Added functionality to the Windows Shell Extension
    - Changes to the tfpt command line tool that allow you to script build management commands

    For full details of the changes, read Brian Harry's post: http://blogs.msdn.com/b/bharry/archive/2011/03/03/mar-11-team-foundation-server-power-tools-are-available.aspx

    To download the Power Tools:
    - Team Foundation Server Power Tools
    - Team Foundation Server Build Extensions Power Tool

    Read the article

  • What tools should I consider if my strategy is to make a game available to as many platforms as possible?

    - by Kenji Kina
    We're planning on developing a 2D, grid-based puzzle game, and although it's still very early in the planning stages, we'd like to make our decisions well from the beginning. Our strategy will be to make the game available to as many platforms as possible, for example PCs (Windows, Mac and/or Linux), mobile phones (iPhone and/or Android based phones), and game consoles (XBLA and/or PSN). PC will have an emphasis, but I believe that's the most flexible platform, so that shouldn't be a problem. So, what programming language, game engine, frameworks and all-around tools would be best suited for our goal? P.S.: I'm betting a single set of tools won't cover ALL of them, and that there will still be some kind of "translating" effort for some platforms, but we'd like to know what the most far-reaching options are.

    Read the article

  • High Performance SQL Views Using WITH(NOLOCK)

    - by gt0084e1
    Every now and then you find a simple way to make everything much faster. We often find customers creating data warehouses or OLAP cubes even though they have a relatively small amount of data (a few gigs) compared to their server memory. If you have more server memory than the size of your database or working set, nearly any aggregate query should run in a second or less. In some situations there may be high traffic from the transactional application, and SQL Server may wait for several other queries to run before giving you your results. The purpose of this is to make sure you don't get two versions of the truth. In an ATM system, you want to give the bank balance after the withdrawal, not before, or you may get a very unhappy customer. So by default, databases are rightly very conservative about this kind of thing. Unfortunately, this split-second precision comes at a cost. The performance of the query may not be acceptable by today's standards because the database has to maintain locks on the server. Fortunately, SQL Server gives you a simple way to ask for the current version of the data without the pending transactions. To better facilitate reporting, you can create a view that includes these directives:

    CREATE VIEW CategoriesAndProducts AS
    SELECT *
    FROM dbo.Categories WITH(NOLOCK)
    INNER JOIN dbo.Products WITH(NOLOCK)
        ON dbo.Categories.CategoryID = dbo.Products.CategoryID

    In some cases queries that were taking minutes end up taking seconds. Much easier than moving the data to a separate database, and it's still pretty much real time, give or take a few milliseconds. You've been warned not to use this for bank balances though.

    Read the article

  • How to recover a deleted NTFS partition and its data after installing Ubuntu 13.04

    - by Anson Varghese
    I've installed Ubuntu 13.04 onto my HP 2231tx computer. During installation all of my data was erased. I didn't know all three of my partitions would be deleted, and I was shocked to find that all of my personal data was gone. I didn't know what to do to resolve this problem, so I searched Google for an answer. I found a program called TestDisk and used it to recover about half of my data, but my personal photos and videos were not among it. Is there a way to recover the other half?

    Read the article

  • TSQL formatting - a sure fire way to start a conversation.

    - by fatherjack
    There are probably as many opinions on ways to format code as there are people writing code, and I am not here to say that any one is better than any other. Well, that isn't true. I am here to say that one way is better than another, but this isn't a matter of preference or personal taste. This is an example of where sloppy formatting can cause TSQL to do weird and whacky things, and of how following some simple methods can make your code more reliable and more robust. Take these two pieces of code, ready...(read more)

    Read the article

  • Data Loading Issues? Try the new Demantra Data Load Guided Resolution

    - by user702295
    Hello! Do you have data loading issues? Perhaps you are trying the new partial schema export tool. New to Demantra: the Data Load Guided Resolution, document 1461899.1. This interactive guide will help you quickly locate known solutions to previously discovered issues, from performance issues, ORA and ODPM errors, to collections-related issues that have no hard error number. The guide covers the diagnosis of data being imported into Demantra and data being exported from Demantra. Contact me with any questions or suggestions. Thank you!

    Read the article

  • SQL Server tempdb size seems large, is this normal?

    - by Abe Miessler
    From what I understand, the tempdb system database is used to hold temporary tables, intermediate results and other temporary information. On one of my database instances I have a tempdb that seems very large (30GB). This database has not been modified (as in "last modified date" on the mdf file) in over a week. Is it normal for tempdb to remain that large for that long a period? It seems to me that it should be updating fairly often and returning the space it is using fairly quickly... Am I way off here, or is SQL Server doing something weird? FYI: This is a SharePoint 2010 database, not sure if that makes a difference.
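
    One way to see how much of a large tempdb is actually allocated is to query its space-usage DMV. A sketch using Invoke-Sqlcmd (assumes the SQL Server PowerShell cmdlets are installed; the instance name is a placeholder, and figures are converted from 8 KB pages to MB):

    $query = 'SELECT SUM(user_object_reserved_page_count)     * 8 / 1024 AS user_objects_mb,
                     SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,
                     SUM(version_store_reserved_page_count)   * 8 / 1024 AS version_store_mb,
                     SUM(unallocated_extent_page_count)       * 8 / 1024 AS free_mb
              FROM sys.dm_db_file_space_usage;'
    Invoke-Sqlcmd -ServerInstance 'SERVERNAME' -Database tempdb -Query $query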

    Read the article

  • Distortion in format of data in WordPad file when shifted from Windows XP to Windows 7

    - by Harpreet
    I have many data files which were set to open in WordPad in Windows XP. Those files have a particular format, like the following:

    Name of Data file
    No. of data columns
    Name of data in column_1
    Name of data in column_2
    .
    .
    .
    Name of data in column_n
    column_1 column_2 column_3 ... column_n

    Now my computer has been formatted and the OS changed to Windows 7; however, when I open my data files in WordPad, the above format is no longer present. The format in WordPad in Windows 7 seems to be distorted. Does anyone know how to restore the format shown above, which is what the data used to look like in XP? I have attached a snapshot of the new distorted format as seen in WordPad in Windows 7. The snapshot shows 100 column names; however, only 5 data columns are present when there should actually be 100.

    Read the article

  • Transferring DHCP using Windows Server Migration Tools - Why is PowerShell crashing on the import of the .mig file?

    - by Mike
    I am migrating DHCP from a Windows Server 2003 R2 DC to a Windows Server 2008 R2 DC. I've followed this video and its predecessor (Installing Windows Server Migration Tools): http://technet.microsoft.com/en-us/video/migrating-dhcp-using-the-windows-server-2008-r2-migration-tools.aspx I went through everything smoothly until the last step. I have exported a .mig file with my DHCP configuration on the old 2003 R2 server. I transferred this .mig file over to my 2008 R2 server; when running the import command, it appears to work for a minute or two and then I get a generic Windows "Powershell has stopped working" error and I have to close the program. Under the problem details I see the following:

    FileVersionOfSystemManagementAutomation: 6.1.7600.16385
    InnermostExceptionType: System.AccessViolationException
    OutermostExceptionType: System.AccessViolationException
    DeepestPowerShellFrame: unknown
    OS Version: 6.1.7600.2.0.0.272.7
    LocaleID: 1033

    It seems like there may be permissions issues? I am running PowerShell as an admin and am logged in to the server as a domain administrator. Any ideas? Thanks
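
    For reference, the export/import steps the video describes are normally run from the Windows Server Migration Tools PowerShell session with the Smig cmdlets. A sketch (the store path is a placeholder; the cmdlets will prompt for the store's encryption password):

    # On the source (2003 R2) server: export the DHCP settings to a migration store
    Export-SmigServerSetting -FeatureID DHCP -Path C:\MigStore -Verbose

    # On the destination (2008 R2) server: import the same store
    Import-SmigServerSetting -FeatureID DHCP -Path C:\MigStore -Verbose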

    Read the article

  • TDD with SQL and data manipulation functions

    - by Xophmeister
    While I'm a professional programmer, I've never been formally trained in software engineering. As I'm frequently visiting here and SO, I've noticed a trend for writing unit tests whenever possible and, as my software gets more complex and sophisticated, I see automated testing as a good idea in aiding debugging. However, most of my work involves writing complex SQL and then processing the output in some way. How would you write a test to ensure your SQL was returning the correct data, for example? Then, say if the data wasn't under your control (e.g., that of a 3rd party system), how can you efficiently test your processing routines without having to hand write reams of dummy data? The best solution I can think of is making views of the data that, together, cover most cases. I can then join those views with my SQL to see if it's returning the correct records and manually process the views to see if my functions, etc. are doing what they're supposed to. Still, it seems excessive and flakey; particularly finding data to test against...
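
    A minimal sketch of the view-based idea in PowerShell (using Invoke-Sqlcmd; the server, database and view names here are hypothetical): compare the output of the SQL under test against a hand-checked view covering the known cases.

    # Hypothetical smoke test: the SQL under test should return exactly the rows
    # that the hand-built coverage view expects
    $expected = (Invoke-Sqlcmd -ServerInstance 'SERVERNAME' -Database 'MyDb' -Query 'SELECT COUNT(*) AS n FROM dbo.vw_ExpectedResults').n
    $actual   = (Invoke-Sqlcmd -ServerInstance 'SERVERNAME' -Database 'MyDb' -Query 'SELECT COUNT(*) AS n FROM dbo.vw_QueryUnderTest').n
    if ($actual -ne $expected) {
        Write-Error "Row count mismatch: expected $expected, got $actual"
    }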

    Read the article

  • Process Improvement and the Data Professional

    - by BuckWoody
    Don’t be afraid of that title – I’m not talking about Six Sigma or anything super-formal here. In many organizations, there are more folks in other IT roles than in the Data Professional area. In other words, there are more developers, system administrators and so on than there are the “DBA” role. That means we often have more to do than the time we need to do it. And, oddly enough, the first thing that is sacrificed is process improvement – the little things we need to do to make the day go faster in the first place. Then we get even more behind, the work piles up and…well, you know all about that. Earlier I challenged you to find 10-30 minutes a day to study. Some folks wrote back and asked “where do I start”? Well, why not be super-efficient and combine that time with learning how to make yourself more efficient? Try out a new scripting language, learn a new tool that automates things or find out ways others have automated their systems. In general, find out what you’re doing and how, and then see if that can be improved. It’s kind of like doing a performance tuning gig on yourself! If you’re pressed for time, look for bite-sized articles (like the ones I’ve done here for PowerShell and SQL Server) that you can follow in a “serial” fashion. In a short time you’ll have a new set of knowledge you can use to make your day faster.

    Read the article

  • First Shard for SQL Azure and SQL Server

    - by Herve Roggero
    That's it!!!!! It's ready to go and be tested, abused and improved! It requires .NET 4.0 and uses some cool technologies, like caching (the new System.Runtime.Caching) and the Task Parallel Library (System.Threading.Tasks). With this library you can:

    - Define a shard of 1, 2 or 100 SQL databases (a mix of SQL Server and SQL Azure)
    - Read from the shard in parallel or sequentially, and cache resultsets
    - Update or delete a record from the shard
    - Insert records quickly in the shard with a round-robin load
    - Reset the cache

    You can download the source code and a sample application here: http://enzosqlshard.codeplex.com/ Note about the breadcrumbs: I had to add a connection GUID in order for the library to know which database a record came from. The GUID is currently calculated on the fly in the library using some of the parameters of the connection string. The GUID is also dynamically added to the result set so the client can pass it back to the library. I am curious to get your feedback on this approach. ** Correction from my previous post: this is a library for a Horizontal Partition Shard (HPS): tables are split across databases horizontally, so in essence the tables need to have the same schema across the databases.

    Read the article

  • SQL Server 2012 and SQLMail - will it still work?

    - by Kharlos Dominguez
    We are considering upgrading our SQL Server, which is currently running 2005. We use SQLMail heavily in the organization, both to send e-mails and to import some into a database. I've read in various places that SQLMail was deprecated and superseded by "Database Mail". I'm confused because this MS page, http://msdn.microsoft.com/en-us/library/bb402904.aspx, seems to imply that it would still work. I understand the dangers of SQLMail, but we do not have the resources to rewrite the scripts right now and would prefer to do it later on. Does SQLMail still work in 2012, and if not, how easy is it to replace with Database Mail, both for reading and sending e-mails?

    Read the article

  • How to connect to database on remote server

    - by user137263
    Where there is a VPN to a remote server, and then access to the database via a local network interface, how can one establish a remote link between one's computer (with a programme such as Visual Studio 2010) and SQL Server (e.g. 2008 R2)? Any attempt to create a direct link to the SQL Server is blocked. Whilst the SQL Server can be configured to allow external access, this brings its own host of problems. Any help would be much appreciated.

    Read the article

  • How to use symbolic links in Windows Server 2008 R2 across the network (mklink)

    - by server info
    I have one server (Srv1) which holds data with file shares, and its storage is full. I now have a second server (Srv2) which has a lot more space. I would like to transfer all the data from Srv1 to Srv2 and have links to the new destination. I found mklink very useful here, but unfortunately it does not work over the network, as the documentation also points out. People rely heavily on the paths, so it would be helpful if someone had a pointer for me on how to handle symbolic links across the network with Windows servers. I am running Windows Server 2008. Thanks for any help
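
    For what it's worth, directory symbolic links pointing at a remote share are usually created with mklink (a cmd built-in, so it is invoked via cmd from PowerShell), and whether clients may follow remote links is governed machine-wide by fsutil. A sketch with placeholder paths and share names:

    # Create a directory symbolic link on Srv1 pointing at the share on Srv2
    cmd /c mklink /D C:\Shares\Data \\Srv2\Data

    # Check, then enable, local-to-remote and remote-to-remote link evaluation
    fsutil behavior query SymlinkEvaluation
    fsutil behavior set SymlinkEvaluation L2R:1 R2R:1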

    Read the article

  • What is the right choice of programming languages and tools for building our website?

    - by Goma
    We are 3 people trying to build a large website which will be available in 3 languages. However, we will start with one language and with a small idea, then we are going to improve it and make it larger! What do you think are the best tools and language that we should use? We care a lot about page loading speed, and about tools that provide excellent quality with lower fees. Edit: We are graphic designers, so we have not chosen the programming language yet. But we studied computer science and we have an idea; we just found that this is the best place to ask the question and expect the right answer from you. Should we use ASP.NET, for example? Or PHP? We do not want an expensive option that will cost us a lot in the future, and we do not want to change the technology for at least the first 5 years. Thanks!

    Read the article

  • How can I perform sentiment analysis on extracted text from online sources?

    - by aniket69
    I'm working on extracting the sentiment from YouTube comments, blogs, news content, Facebook wall posts, and Twitter feeds. I'm looking for an automated way to do this: the two third-party solutions I've found have been AlchemyAPI and RapidMiner. Are these the best way to approach this project, or should I be using something else? Is there a more efficient way to approach sentiment analysis? What techniques have worked for you in a project like this?

    Read the article

  • Design practice for securing data inside Azure SQL

    - by Sid
    Update: I'm looking for a specific design practice as we try to build our own database encryption. Azure SQL doesn't support many of the encryption features found in SQL Server (table and column encryption). We need to store some sensitive information that needs to be encrypted, and we've rolled our own using AesCryptoServiceProvider to encrypt/decrypt data to/from the database. This solves the immediate issue (no cleartext in the db) but poses other problems, like:

    - Key rotation (we have to roll our own code for this, walking through the db converting old ciphertext into new ciphertext)
    - Metadata mapping of which tables and which columns are encrypted. This is simple when it's just a couple of columns (send an email to all devs/document it) but that quickly gets out of hand...

    So, what is the best practice for doing application-level encryption into a database that doesn't support encryption? In particular, what is a good design to solve the above two bullet points? If you have specific schema additions, I would love it if you could give details ("Have a NVARCHAR(max) column to store the cipher metadata as JSON" or a SQL script/commands). If someone would like to recommend a library, I'd be happy to stay away from "DIY" too. Before going too deep - I assume there isn't any way I can add encryption support to Azure by creating a stored procedure, right?
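
    For reference, the AesCryptoServiceProvider pattern the question describes looks roughly like this (a PowerShell sketch of the same .NET API; key and IV handling is deliberately simplified and would come from your key-rotation store in practice):

    $aes = New-Object System.Security.Cryptography.AesCryptoServiceProvider
    $aes.GenerateKey()   # placeholder: the real key comes from your key store
    $aes.GenerateIV()    # a fresh IV per value; store it alongside the ciphertext

    $plainBytes  = [System.Text.Encoding]::UTF8.GetBytes('sensitive value')
    $cipherBytes = $aes.CreateEncryptor().TransformFinalBlock($plainBytes, 0, $plainBytes.Length)

    # Persist e.g. [Convert]::ToBase64String($cipherBytes) plus the IV in the table
    $roundTrip = [System.Text.Encoding]::UTF8.GetString(
        $aes.CreateDecryptor().TransformFinalBlock($cipherBytes, 0, $cipherBytes.Length))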

    Read the article
