Search Results

Search found 99645 results on 3986 pages for 'sql server 2005'.

  • Design practice for securing data inside Azure SQL

    - by Sid
    Update: I'm looking for a specific design practice as we try to build our own database encryption. Azure SQL doesn't support many of the encryption features found in SQL Server (table and column encryption). We need to store some sensitive information encrypted, and we've rolled our own using AesCryptoServiceProvider to encrypt/decrypt data to/from the database. This solves the immediate issue (no cleartext in the db) but poses other problems:

    - Key rotation: we have to roll our own code for this, walking through the db converting old ciphertext into new ciphertext.
    - Metadata mapping of which tables and which columns are encrypted: this is simple when it's just a couple of columns (send an email to all devs / document it), but that quickly gets out of hand.

    So, what is the best practice for doing application-level encryption against a database that doesn't support it? In particular, what is a good design that solves the two bullet points above? If you have specific schema additions, I'd love details ("Have an NVARCHAR(max) column to store the cipher metadata as JSON", or a SQL script/commands). If someone would like to recommend a library instead, I'd be happy to stay away from DIY too. Before going too deep - I assume there isn't any way I can add encryption support to Azure by creating a stored procedure, right?
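
    Since the question explicitly invites schema suggestions: a minimal sketch of one possible design, with all names (KeyVersions, EncryptedColumns) hypothetical - a key-version table plus a metadata table recording exactly which columns are encrypted and under which key version, so rotation state lives in the database rather than in emails:

        CREATE TABLE KeyVersions (
            KeyVersionId int IDENTITY(1,1) PRIMARY KEY,
            CreatedUtc   datetime NOT NULL DEFAULT GETUTCDATE(),
            RetiredUtc   datetime NULL               -- NULL = current key
        );

        CREATE TABLE EncryptedColumns (
            TableName    sysname NOT NULL,
            ColumnName   sysname NOT NULL,
            KeyVersionId int NOT NULL REFERENCES KeyVersions (KeyVersionId),
            PRIMARY KEY (TableName, ColumnName)
        );

    Key rotation then becomes: insert a new KeyVersions row, re-encrypt one table at a time, and update each EncryptedColumns row as its column finishes, so an interrupted rotation still leaves an accurate record of which ciphertext uses which key. (The key material itself would stay outside the database.)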


  • Installing SQL Server 2012 on Windows 2012 Server

    - by andyleonard
    In Want to Learn SQL Server 2012? I wrote about obtaining a fully-featured version of SQL Server 2012 (Developer Edition). This post represents one way to install SQL Server 2012 Developer Edition on a Hyper-V virtual machine running the Windows 2012 Server Standard Edition operating system. This is by no means exhaustive. My goal in writing this is to help you get a default instance of SQL Server 2012 up and running. I do not cover setting up the Hyper-V virtual machine. I begin after loading the...(read more)


  • How do I restore a SQL Server database from last night's full backup and the active transaction log file?

    - by Dylan Beattie
    I have been told that it's good practice to keep your SQL Server data files and log files on physically separate disks, because it'll allow you to recover your data to the point of failure if the data drive fails. So... let's say that mydata.mdf is on drive D:, and my mydata_log.ldf is on drive E:, and it's 16:45, and drive D: has just died horribly. So - I have last night's full backup (mydata.bak). I have hourly transaction-log backups that will bring the data back up to 16:00... but that means I'll lose 45 minutes worth of updates. I still have mydata_log.ldf on the E: drive, which should contain EVERY transaction that was committed right up to the point where the drive failed. How do I go about recreating the database and restoring data from the backup file and the live transaction log, so I don't lose any updates? Is this possible?
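
    For reference, a hedged sketch of the usual tail-log restore sequence under the names above (the backup paths and the logical file name are assumptions; since D: is dead, the data file is moved):

        -- 1. Back up the tail of the log from the surviving E: drive
        --    (NO_TRUNCATE works even though the data file is gone)
        BACKUP LOG mydata
            TO DISK = 'E:\mydata_tail.trn'
            WITH NO_TRUNCATE;

        -- 2. Restore last night's full backup without recovering,
        --    moving the data file off the dead D: drive
        RESTORE DATABASE mydata
            FROM DISK = 'E:\backups\mydata.bak'         -- hypothetical path
            WITH NORECOVERY, REPLACE,
                 MOVE 'mydata' TO 'E:\mydata.mdf';      -- 'mydata' = assumed logical name

        -- 3. Apply the hourly log backups in order (repeat per file)
        RESTORE LOG mydata
            FROM DISK = 'E:\logbackups\mydata_1600.trn' -- hypothetical path
            WITH NORECOVERY;

        -- 4. Apply the tail backup and bring the database online
        RESTORE LOG mydata
            FROM DISK = 'E:\mydata_tail.trn'
            WITH RECOVERY;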


  • Aspnet_merge error has no detail

    - by dang57
    I have been attempting to add a Deployment Project to my web app. When I build it, I get the message "An error occurred when merging assemblies: Exception from HRESULT: 0x806D0004". There is no other detail, like an ILMerge error or a duplicate name. I have verbosity set to Diagnostic, and this is the output:

        Command:
        C:\Program Files\MSBuild\Microsoft\WebDeployment\v8.0\aspnet_merge.exe "\...XXX...\My Documents\Visual Studio 2005\Projects\XXX_deploy\Debug" -o XXX_deploy -debug -copyattrs
        The "AspNetMerge" task is using "aspnet_merge.exe" from "C:\Program Files\MSBuild\Microsoft\WebDeployment\v8.0\aspnet_merge.exe".
        Utility to merge precompiled ASP.NET assemblies.
        An error occurred when merging assemblies: Exception from HRESULT: 0x806D0004
        C:\Program Files\MSBuild\Microsoft\WebDeployment\v8.0\Microsoft.WebDeployment.targets(474,9): error MSB6006: "aspnet_merge.exe" exited with code 1.
        Done executing task "AspNetMerge" -- FAILED.
        Done building target "AspNetMerge" in project "XXX_deploy.wdproj" -- FAILED.
        Done building project "XXX_deploy.wdproj" -- FAILED.
        Build FAILED.

    I have tried running the command via the command prompt, but it does not give any additional information. I have also removed EVERYTHING from the project, including references, style sheets, forms, and table adapters. I still have a web.config, but deleted all app-specific lines. I added a single new form named Default. I have even tried renaming that form to DefaultX, just in case there was another Default out there. I still get the error. What else can I look for? I'm running VS 2005 v8.05. Thanks, Dan


  • Why is Reporting Services report vastly slower than its query?

    - by Telos
    I have a query that takes roughly 2 minutes to run. It's not terribly complex in terms of parameters or anything, and the report itself doesn't do any truly extensive processing - it basically just spits the data straight out in a nice format. (Actually, one of the reports doesn't format the data at all, just returns a flat table meant to be manipulated in Excel.) It's not returning a massive set of data either. Yet the report takes upwards of 30 minutes to run. What could cause this? This is SSRS 2005 against a SQL Server 2005 database, btw.

    EDIT: OK, I found that with the addition of WITH (NOLOCK) in the report's query, it takes the same time as the query does through SSMS. Why would the query be handled differently if it's coming from Reporting Services (or Visual Studio on my local machine) than if coming from SSMS on my local machine? I saw the query running in Activity Monitor a couple of times in SLEEP_WAIT mode, but not blocked by anything...

    EDIT2: The connection string is: Data Source=SERVERNAME;Initial Catalog=DBName
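
    For reference, the change described in the EDIT amounts to a table hint like this (table, columns, and parameter are hypothetical); it reads without taking shared locks, which also means dirty reads become possible:

        DECLARE @StartDate datetime;
        SET @StartDate = '2010-01-01';

        SELECT OrderId, OrderDate, Total
        FROM   dbo.Orders WITH (NOLOCK)   -- hypothetical table
        WHERE  OrderDate >= @StartDate;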


  • How to avoid chaotic ASP.NET web application deployment?

    - by emzero
    Ok, so here's the thing. I'm developing an existing web application (it started as an ASP classic app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, sharing the source code and the Visual Studio database project. This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code. So manual deployment is really annoying, because I have to deploy the source code and then run the SQL scripts by hand on each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it. We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts for different targets, but I have no idea how to put the pieces together. I would like to:

    - Make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to deploy the whole application every time, because it has lots of files; I just want to deploy those that are new or changed.
    - Generate diff-SQL update scripts for every database target and combine them into just one script. For this I should have a list of the database names somewhere (see the sketch after this list).
    - Copy the site files and execute the generated SQL script in an easy and automated way.

    I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deploy. If there is a better and easier way of doing it than what I enumerated, I'll be pleased to read your options. I know this is not a very specific question, but I've googled a lot about it and it seems I cannot figure out how to do it. I've never used any automation tool to deploy. Any help will be really appreciated. Thank you all, Regards
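
    On the "one diff script, many databases" point, a minimal T-SQL sketch under stated assumptions (a hypothetical #UniverseDatabases list of database names; the combined diff script treated as a single batch with no GO separators) that applies the same script to every universe database on the target server:

        -- Hypothetical list of universe databases to update
        CREATE TABLE #UniverseDatabases (DbName sysname NOT NULL);
        INSERT INTO #UniverseDatabases (DbName) VALUES ('Universe_A');
        INSERT INTO #UniverseDatabases (DbName) VALUES ('Universe_B');

        DECLARE @db sysname, @sql nvarchar(max);
        DECLARE db_cursor CURSOR FOR SELECT DbName FROM #UniverseDatabases;
        OPEN db_cursor;
        FETCH NEXT FROM db_cursor INTO @db;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- Prefix the combined diff script with USE so it runs in each database.
            -- Assumes the script text contains no GO batch separators.
            SET @sql = N'USE ' + QUOTENAME(@db) + N'; ' + N'/* combined diff script here */';
            EXEC (@sql);
            FETCH NEXT FROM db_cursor INTO @db;
        END
        CLOSE db_cursor;
        DEALLOCATE db_cursor;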


  • Import IIS log into SQL Server 2008 error

    - by Vivek Chandraprakash
    I'm trying to import IIS logs into SQL Server 2008, and I get the error below:

        Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "cs(User-Agent)" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)

    I tried changing the column width of the user-agent column to varchar(8000) and nvarchar(4000) - no luck. Please help. -Vivek
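
    A hedged sketch of the destination-side change being described (the table name is hypothetical) - though this truncation status often comes from the source column length configured in the wizard's Advanced settings rather than from the destination table, so that may need raising as well:

        -- Widen the destination column before re-running the import
        ALTER TABLE dbo.IISLog                          -- hypothetical table
            ALTER COLUMN [cs(User-Agent)] nvarchar(4000) NULL;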


  • Offline backup synchronization

    - by Pavan Kumar
    There is a central server running Windows Server 2003 and SQL Server 2005, and there are 7 client machines situated in various places, each with XP Pro and SQL Server 2005 installed. They are not interconnected, so they are physically separate. One person goes to each of these centers maybe twice a month and takes a backup (full database, consisting of the mdf and ldf files) on a pen drive and brings it to the central server, which contains the central database holding the same schema as all the client databases. I need to synchronize each backup database (belonging to a different center) one by one, updating existing data or inserting new data in the central database.

    The solution I tried was replication. The pen drive is brought to the central server with the 7 instances of the databases, and the databases are attached one by one to the same SQL Server where the central database exists. My idea was to replicate each backup database in turn, i.e. a single subscription (the central database) and multiple publications (the 7 attached databases), performing the replication locally on the same machine. So I tried to develop a UI in C#.NET to programmatically run transactional replication with push subscriptions using RMO programming (which is incomplete as of now, because there is no point finishing it when I already know it is not the solution).

    Transactional replication can be initialized either with or without a snapshot. If I go for the first option, with a snapshot, whatever data is present in the central database is overwritten by the new data, so the data initially present in the central database is lost. If I initialize without a snapshot, no data will be sent from the backup database to the server: replication only picks up incremental changes made after it is set up, so the data already in the backup database when the replication is configured is never replicated when the snapshot agent first runs; only changes made to the backup database thereafter would reach the central database. (Remember, I am not going to insert new data or make any changes to the backup database after I attach it to the central server.) So this solution is not feasible.

    I want a solution for synchronizing from one client database to the central database on the same machine using C#.NET. If you can provide a small example, maybe with two databases with the same schema, DB1 (client) to DB2 (server), consisting of one or two tables, it will be very helpful. The synchronization is not bidirectional; I only want to update existing data or insert new data from DB1 to DB2 (DB2 may contain some data initially). Thanks and Regards, Pavan
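
    Not the C#.NET wrapper asked for, but a hedged sketch of the core one-way, per-table sync in plain T-SQL, assuming a hypothetical table Patients(Id, Name) with the attached backup as DB1 and the central database as DB2 (SQL Server 2005 has no MERGE, so update-then-insert):

        -- Update rows that already exist in the central database
        UPDATE c
        SET    c.Name = s.Name
        FROM   DB2.dbo.Patients AS c
               JOIN DB1.dbo.Patients AS s ON s.Id = c.Id;

        -- Insert rows that exist only in the client backup
        INSERT INTO DB2.dbo.Patients (Id, Name)
        SELECT s.Id, s.Name
        FROM   DB1.dbo.Patients AS s
        WHERE  NOT EXISTS (SELECT 1 FROM DB2.dbo.Patients AS c
                           WHERE c.Id = s.Id);

    The same pair of statements could be run from C# via SqlCommand, once per table, after attaching each center's backup to the central server.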


  • SQL Server 2008 Management Studio doesn't recognize new Schema

    - by Lieven Cardoen
    I have created a new schema in a database, called Contexts. Now when I want to write a query, Management Studio doesn't recognize the tables that belong to the new schema; it says 'Invalid object name Contexts.ContextLibraries'. The Transact-SQL:

        INSERT INTO [Contexts].[ContextLibraries] (ChannelId, [IsSystem])
        VALUES (@ChannelId, 1)

    When I try the same thing on my local database, it does work... Any ideas? I did try changing the user's default schema from dbo to Contexts, but that doesn't help. I also checked Contexts under "Schemas owned by this user", without success.

    Update: Apparently the SQL query does work; it's just the editor that flags the object as invalid.


  • SQL Server 2008 and SQL CE over the Microsoft Sync Framework

    - by malik
    I'm getting this error:

        The server will drop the connection, because the client driver has sent multiple requests while the session is in single-user mode. This error occurs when a client sends a request to reset the connection while there are batches still running in the session, or when the client sends a request while the session is resetting a connection. Please contact the client driver vendor.

    The synchronization process sometimes works but mostly fails. I have 5 SQL CE clients that need to sync, and I am using WCF, IIS, and SQL Server 2008 for this process.


  • Can you update/add records in SQL using a datagridview and LINQ to SQL

    - by Jordan S
    Is it possible to bind a DataGridView to a LINQ to SQL class so that when I make changes to the records in the DataGridView it automatically updates the SQL database? I have tried binding the data like this, but if I make changes to the data in the DataGridView they do not actually affect the data in the database...

        BOMClassesDataContext DB = new BOMClassesDataContext();
        var mfrs = from m in DB.Manufacturers
                   select m;
        BindingSource bs = new BindingSource();
        bs.DataSource = mfrs;
        dataGridView1.DataSource = bs;


  • Selecting the most common value from relation - SQL statement

    - by Ronnie
    I have a table within my database that has many records; some records share the same value for one of the columns, e.g.:

        | id | name | software  |
        |----|------|-----------|
        | 1  | john | photoshop |
        | 2  | paul | photoshop |
        | 3  | gary | textmate  |
        | 4  | ade  | fireworks |
        | 5  | fred | textmate  |
        | 6  | bob  | photoshop |

    I would like to return the value of the most commonly occurring piece of software, using a SQL statement. So in the example above, the required SQL statement would return 'photoshop', as it occurs more than any other piece of software. Is this possible? Thank you for your time.
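
    A minimal sketch of one common answer in SQL Server's dialect (the table isn't named in the question, so dbo.usage is hypothetical): group by the column and keep the group with the highest count:

        SELECT TOP 1 software
        FROM   dbo.usage          -- hypothetical table name
        GROUP  BY software
        ORDER  BY COUNT(*) DESC;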


  • Very uneven CPU utilization with SQL Server 2012 on 2 processor computer with 16 cores / processor

    - by cooplarsh
    After installing SQL Server Enterprise 2012 with the Server + CAL license model on a computer with 2 processors, each with 16 cores (and no hyperthreading involved), and putting the server under extremely heavy load, the 16 cores on the first processor were very underutilized, the first 4 cores on the 2nd CPU were heavily utilized, and the last 12 cores were not used at all (because of the 20-core limit for this SQL Server edition). Total CPU utilization was displaying as around 25%. Unfortunately, the server suffered from extremely poor performance, even though if the tasks had been evenly distributed across the 20 cores it wouldn't have been anywhere near as bad.

    The Windows Server was running on a VMware virtual image under ESX Server, but all of the CPU was allocated to the Windows server. We tried changing affinity settings (e.g., allocating most cores to CPU and the others to I/O), but that didn't help solve the performance problems.

    Upgrading the product edition to SQL Server Enterprise Core 2012 not only allowed SQL Server to utilize the 12 previously unused cores on the 2nd processor, it also resulted in a much more even distribution of tasks across all of the processors. To get through the backlog of requests, CPU utilization jumped to around 90%, then came down to around 33% once it was caught up; performance improved dramatically once we failed over to the newly upgraded version, and the performance issues went away.

    I was wondering if anyone knows what might cause SQL Server to unevenly distribute the load, relying almost exclusively on the first 4 cores of the 2nd processor while 12 of its cores sat idle, and allocating only a few tasks to each of the 16 cores on the first processor. Also, is there any way we could have more evenly distributed the load across the 20 usable cores without the product edition upgrade? The flip side of that question is: what did the product upgrade do that caused SQL Server to start evenly distributing the load across all of the cores it recognized? Thanks for any insight and/or links that might help me make sense of what was happening.
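
    As a hedged diagnostic aside (not part of the original post): the scheduler DMV makes the core ceiling visible directly, since cores beyond the license limit show up as VISIBLE OFFLINE:

        -- How many schedulers (cores) SQL Server actually brought online
        SELECT status, COUNT(*) AS scheduler_count
        FROM   sys.dm_os_schedulers
        WHERE  scheduler_id < 255      -- user schedulers only
        GROUP  BY status;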


  • SQL Server - Query Short-Circuiting?

    - by Sam Schutte
    Do T-SQL queries in SQL Server support short-circuiting? For instance, I have a situation where I have two databases and I'm comparing data between the two tables to match and copy some info across. In one table, the "ID" field will always have leading zeros (such as "000000001234"), and in the other table, the ID field may or may not have leading zeros (might be "000000001234" or "1234"). So my query to match the two is something like:

        select * from table1 where table1.ID LIKE '%1234'

    To speed things up, I'm thinking of adding an OR before the LIKE that just says table1.ID = table2.ID, to handle the case where both IDs have the padded zeros and are equal. Will doing so speed up the query by matching items on the "=" and not evaluating the LIKE for every single row (will it short-circuit and skip the LIKE)?
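
    A sketch of the combined predicate being proposed (written as a join, since the original snippet uses a literal value):

        SELECT t1.*
        FROM   table1 AS t1
               JOIN table2 AS t2
                 ON t1.ID = t2.ID                -- exact match on padded IDs
                 OR t1.ID LIKE '%' + t2.ID;      -- fallback for unpadded IDs

    Worth noting that SQL Server makes no guarantee about the order in which predicates are evaluated, so whether the LIKE is skipped when the equality already matches is up to the optimizer.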


  • INNER JOIN vs LEFT JOIN performance in SQL Server

    - by Ekkapop
    I've created a SQL command that uses INNER JOIN across 9 tables; however, this command takes a very long time to run (more than five minutes). So a colleague suggested I change the INNER JOINs to LEFT JOINs, because the performance of LEFT JOIN is supposedly better - which, at first, seemed contrary to what I know. After I made the change, the speed of the query improved significantly. I want to know why LEFT JOIN would be faster than INNER JOIN here. My SQL command looks like the following:

        SELECT * FROM A
        INNER JOIN B ON ...
        INNER JOIN C ON ...
        INNER JOIN D ...

    and so on.


  • Getting clusters of rows close together in time

    - by Mike
    I have a table, basically like so:

        ID | ItemID | Start             | End
        ----------------------------------------------------
        1  | 234    | 10/20/09 8:34:22  | 10/20/09 8:35:10
        2  | 274    | 10/20/09 8:35:30  | 10/20/09 8:36:27
        3  | 272    | 10/21/09 12:15:00 | 10/21/09 12:17:00
        4  | 112    | 10/21/09 12:20:14 | 10/21/09 12:21:21
        5  | 15     | 10/21/09 12:22:39 | 10/21/09 12:24:15

    There are two "clusters" of entries here, 1-2 and 3-5, separated by a gap in time - specifically, a gap of 30 minutes is what I'm interested in. What I would like is the first and last rows of each cluster of entries. This is fairly easy to achieve by retrieving all the rows and looping through them in order of start time, but I'd like to have it in SQL if possible. I'm using SQL Server 2008. Thanks.
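
    A hedged sketch of one gaps-and-islands approach that works on SQL Server 2008 (no LAG until 2012; the table name dbo.Items is an assumption): flag rows that start a cluster because no earlier row ends within 30 minutes of their start, turn the running count of flags into a cluster id, then take MIN/MAX per cluster:

        WITH flagged AS (
            SELECT i.ID, i.ItemID, i.[Start], i.[End],
                   CASE WHEN EXISTS (SELECT 1 FROM dbo.Items AS p
                                     WHERE p.[Start] < i.[Start]
                                       AND p.[End] >= DATEADD(MINUTE, -30, i.[Start]))
                        THEN 0 ELSE 1 END AS IsClusterStart
            FROM dbo.Items AS i
        ),
        numbered AS (
            SELECT f.ID, f.[Start], f.[End],
                   (SELECT SUM(f2.IsClusterStart)        -- running count = cluster id
                    FROM flagged AS f2
                    WHERE f2.[Start] <= f.[Start]) AS ClusterId
            FROM flagged AS f
        )
        SELECT ClusterId,
               MIN([Start]) AS ClusterStart,
               MAX([End])   AS ClusterEnd
        FROM numbered
        GROUP BY ClusterId;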


  • Generate SQL Server Express database from Entity Framework 4 model

    - by Cranialsurge
    I am able to auto-generate a SQL Server CE 4.0 *.sdf file using code-first generation, as explained by Scott Guthrie here. The connection string for that is as follows:

        <add name="NerdDinners" providerName="System.Data.SqlServerCe.4.0" connectionString="data source=|DataDirectory|NerdDinner.sdf"/>

    However, if I try to generate an .mdf instead using the following connection string, it fails with the error "The provider did not return a ProviderManifestToken string.":

        <add name="NerdDinners" providerName="System.Data.SqlClient" connectionString="data source=|DataDirectory|NerdDinner.mdf"/>

    Even directly hooking into a SQLEXPRESS instance using the following connection string fails:

        <add name="NerdDinners" providerName="System.Data.SqlClient" connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=NerdDinner;Integrated Security=True"/>

    Does EF 4 only support SQL CE 4.0 for database creation from a model for now, or am I doing something wrong here?


  • Help Reading Binary Image Data from SQL Server into PHP

    - by Joe Majewski
    I cannot seem to figure out a way to read binary data from SQL Server into PHP. I am working on a project where I need to store the image directly in the SQL table, not on the file system. Currently I have been using a query like this one:

        INSERT INTO myTable (Document)
        SELECT * FROM OPENROWSET(BULK N'C:\image.jpg', SINGLE_BLOB) AS BLAH

    This works fine to actually insert the image into the table, but I haven't yet figured out a way to retrieve it and get my image back. I am doing this with PHP, and ultimately will have to make a stored procedure out of it, but can anyone enlighten me on a way to get that binary data (varbinary(MAX)) and generate an image on the fly? I expected it to be as simple as using a SELECT statement and adding a content-type header indicating it was an image, but it's simply not working. Instead, the page just displays the name of the file, which I have encountered in the past and understand to be an error with the image data.


  • Correct Way to Get Date Between Dates In SQL Server

    - by Chuck Haines
    I have a table in SQL Server with a DATETIME field called Date_Printed. I am trying to get all records in the table which lie between a specified date range. Currently I am using the following SQL:

        DECLARE @StartDate DATETIME
        DECLARE @EndDate DATETIME
        SET @StartDate = '2010-01-01'
        SET @EndDate = '2010-06-18 12:59:59 PM'

        SELECT * FROM table
        WHERE Date_Printed BETWEEN @StartDate AND @EndDate

    I have an index on the Date_Printed column. I was wondering if this is the best way to get the rows in the table which lie between those dates, or if there is a faster way. The table has about 750,000 records in it right now and it will continue to grow. The query is pretty fast, but I'd like to make it faster if possible.
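
    One variant worth sketching alongside the BETWEEN form (same index usage; the practical difference is that a half-open range avoids hand-building an inclusive end timestamp like '12:59:59 PM'):

        DECLARE @StartDate DATETIME
        DECLARE @EndDate DATETIME
        SET @StartDate = '2010-01-01'
        SET @EndDate   = '2010-06-18 13:00:00'  -- exclusive upper bound

        SELECT * FROM [table]
        WHERE Date_Printed >= @StartDate
          AND Date_Printed <  @EndDate          -- half-open range, still index-friendly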


  • Using multiple aggregate functions in an algebraic expression in (ANSI) SQL statement

    - by morpheous
    I have the following aggregate functions (AGG FUNCs): foo(), foobar(), fredstats(), barneystats(). I want to know if I can use multiple AGG FUNCs in an algebraic expression. This may seem a strange/simplistic question for seasoned SQL developers; however, the reason I ask is that so far, all the AGG FUNC examples I have seen are of the simplistic variety, e.g. max(salary) < 100, rather than expressions involving multiple AGG FUNCs (like agg_func1() <op> agg_func2()). The information below should help clarify further. Given tables with the following schemas:

        CREATE TABLE item (id int, length float, weight float);
        CREATE TABLE item_info (item_id int, name varchar(32));

    is it legal (ANSI) SQL to write queries of this format?

        SELECT A.id, B.name, A.foo, A.foobar, A.fredstats
        FROM (SELECT id,
                     foo(123)            AS foo,
                     foobar('red')       AS foobar,
                     fredstats('weight') AS fredstats
              FROM item
              GROUP BY id
              HAVING [ALGEBRAIC EXPRESSION]) AS A,
             item_info AS B
        WHERE A.id = B.item_id
        ORDER BY A.id

    Here [ALGEBRAIC EXPRESSION] is the type of expression that could be used in a WHERE clause, for example:

        ((foo(x) < foobar(y)) AND foobar(y) IN (1,2,3)) OR (fredstats(x) <> 0)

    I am using PostgreSQL as the db, but I would prefer to use ANSI SQL wherever possible. Assuming it is legal to include AGG FUNCs in the way I have done above, I'd like to know:

    1. Is there a more efficient way to write the above query?
    2. Is there any way I can speed up the query with a judicious choice of indexes on the tables item and item_info?
    3. Is there a performance hit from using AGG FUNCs in an algebraic expression like this (i.e. an expression involving the output of aggregate functions rather than constants)?
    4. Can the expression also include 'scaled' AGG FUNCs (for example: 2*foo(123) < -3*foobar(456))? Will scaling (i.e. multiplying an AGG FUNC by a number) have an effect on performance?
    5. How can I write the query above using INNER JOINs instead? (A hedged sketch follows below.)
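
    As a point of reference, a minimal sketch with built-in aggregates standing in for the user-defined ones (foo() etc. are the asker's own functions, so MAX/MIN/AVG are substituted here), showing an algebraic expression over several aggregates in a HAVING clause, written with an INNER JOIN; this runs as-is against the schemas above on PostgreSQL or SQL Server:

        SELECT A.id, B.name, A.max_len, A.avg_wt
        FROM (SELECT id,
                     MAX(length) AS max_len,
                     AVG(weight) AS avg_wt
              FROM item
              GROUP BY id
              HAVING 2 * MAX(length) < -3 * AVG(weight)
                  OR MIN(weight) <> 0) AS A
        INNER JOIN item_info AS B
                ON B.item_id = A.id
        ORDER BY A.id;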

