Search Results

Search found 102921 results on 4117 pages for 'sql server integration se'.


  • Presenting at the San Francisco SQL Server User Group - 12-Sep-2012

    - by RickHeiges
    I have a business trip scheduled out far enough in advance for a change. I was able to schedule a presentation at the San Francisco SQL Server User Group on Sep 12 about SQL Server Consolidation Strategies. If you will be in the SF area on Sep 12, I invite you to attend or just drop by to say hello. You can find out more about the group at http://www.meetup.com/The-San-Francisco-SQL-Server-Meetup-Group/ Hope to see you there!...(read more)

    Read the article

  • Move a SQL Azure server between subscriptions

    - by jamiet
    In September 2011 I published a blog post SSIS Reporting Pack v0.2 now available in which I made available the credentials of a sample database that one could use to test SSIS Reporting Pack. That database was sitting on a paid-for Azure subscription and hence was costing me about £5 a month - not a huge amount, but when I later got a free Azure subscription through my MSDN Subscription in January 2012 it made sense to migrate the database onto that subscription. Since then I have been endeavouring to make that move, but a few failed attempts combined with lack of time meant that I had not yet gotten round to it. That is, until this morning, when I heard about a new feature available in the Azure Management Portal that enables one to move a SQL Azure server from one subscription to another. Up to now I had been attempting to use a combination of SSIS packages and/or scripts to move the data but, as I alluded, I ran into a few roadblocks, hence the ability to move a SQL Azure server was a godsend to me. I fired up the Azure Management Portal and a few clicks later my server had been successfully migrated; moreover, the name of the server doesn't change and neither do any credentials, so I have no need to go and update my original blog post either. It's easy to be cynical about SQL Azure (and I maintain a healthy scepticism myself) but that, my friends, is cool! You can read more about the ability to move SQL Azure servers between subscriptions from the official blog post Moving SQL Azure Servers Between Subscriptions. @Jamiet

    Read the article

  • R2 and Idera SQL Safe (Freeware Edition)

    - by DavidWimbush
    Good news: the Freeware edition of Idera SQL Safe works on R2. You might not care but I certainly do. Here's why: In September last year I started using Idera SQL Safe (the Freeware Edition) to get backup compression on my SQL 2005 servers. It seemed like a good idea at the time - it was free and my backups ran much faster and took up much less disk space. I really thought I'd actually scored a free lunch. Until they discontinued the product. I was thinking about what to do when I heard that R2 Standard would include native backup compression so I've just been keeping my fingers crossed since then. So I installed R2 Developer on my laptop, installed SQL Safe and kicked off a restore with it. No problem. Phew! Now I won't have to do a special, non-compressed backup and restore when we migrate.
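
    For anyone making the same move, native backup compression in R2 is just an option on the standard BACKUP command. A minimal sketch (the database name and path here are placeholders):

        -- Native backup compression, available in SQL Server 2008 Enterprise
        -- and from SQL Server 2008 R2 Standard onwards. Names are hypothetical.
        BACKUP DATABASE [MyDatabase]
        TO DISK = N'D:\Backups\MyDatabase.bak'
        WITH COMPRESSION, INIT, STATS = 10;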

    Read the article

  • Oracle Announces Leading ISV Integration With Oracle Sales and Marketing Cloud Service

    - by Richard Lefebvre
    More Than 100 ISVs, including BigMachines, Marketo and Xactly, now Provide Integrated Offerings to Help Maximize Sales and Single Customer Viewpoint

    Demonstrating its continued commitment to business value via open standards and the cloud, Oracle today announced that more than 100 leading ISVs are integrating in the cloud with Oracle Sales and Marketing Cloud Service, a service available through Oracle Cloud. For the first time, Oracle Sales and Marketing Cloud Service users can choose from a wide array of directly integrated third-party solutions, providing a new level of choice, seamless deployment and a single view of customers with preferred implementations.

    Top partners, including ActivePrime, Avaya, BigMachines, Box, Brainshark, Callidus Software, CirrusPath, Clicktools, CRMIT, DBSync, EchoSign from Adobe, Eloqua, Fliptop, FPX, HarQen, HubSpot, iHance, InsideSales.com, InsideView, Interactive Intelligence, Lingotek, LinkPoint360, Marketo, Nuance, PerspecSys, Postcode Anywhere, Revegy, salesElement, StrikeIron, upsourceIT, White Springs, X+1 and Xactly, have announced their availability and integration today.

    By integrating with Oracle Sales and Marketing Cloud Service, ISV solutions can easily be leveraged by customers. By choosing Oracle Sales and Marketing Cloud Service as a sales platform, customers will continue to have complete choice of their own quoting, lead management and sales methodology solutions, all pre-integrated with Oracle Sales and Marketing Cloud Service. With integration built on standards-based technologies such as SOAP web services, Oracle Sales and Marketing Cloud Service customers choosing ISV integrations will also benefit from the familiar ease of use of the Oracle Sales and Marketing Cloud Service user interface, including buttons, links and custom objects, for a rich user experience. ISV integration with Oracle Sales and Marketing Cloud Service also enables on-demand contextual data exchange capabilities, linking Oracle Sales and Marketing Cloud Service business data with third-party application data for a complete CRM view.

    ISVs building robust, repeatable integrations with Oracle Sales and Marketing Cloud Service can begin the process of achieving Oracle Validated Integration, an Oracle PartnerNetwork program that recognizes Oracle partner solutions with proven integration to Oracle Applications. ISVs can learn more about Oracle Validated Integration here. For customers, Oracle Validated Integration means that a partner's integration has been tested and validated as functionally and technically sound, that the partner solution is integrated with Oracle Sales and Marketing Cloud Service in a reliable, standardized way, and that the integration operates and performs as documented.

    Oracle Cloud provides a broad portfolio of Platform Services, Application Services, and Social Services, all on a subscription basis. Oracle Cloud delivers instant value and productivity for end users, administrators, and developers through functionally rich, integrated, secure, enterprise cloud services.

    Supporting Quotes

    "BigMachines is a leader in Configure, Price, and Quote solutions in the Cloud. Our solution delivers accurate quotes directly from an opportunity, integrated with the leading Oracle Sales and Marketing Cloud application from Oracle," says John Pulling, Senior Vice President of Products at BigMachines. "Together, BigMachines and Oracle efficiently automate changes, enabling a faster, more efficient sales process for our joint customers."

    "Modern marketing and sales must engage customers and prospects in real time across the web, email, social media, online and offline channels to understand where and how to allocate their budgets for maximum return," said Srini Venkatesan, Senior VP, Products and Engineering at Marketo. "Alignment and integration with Oracle Sales and Marketing Cloud Service allows Marketo's solutions to deliver innovative capabilities for sales and marketing to adapt and grow their business on the core Oracle platform for CRM."

    "Sales incentives are the best way to drive better performance. Well managed incentives improve the bottom line, particularly when combined with effective sales systems," said Christopher Cabrera, president and CEO of Xactly Corporation. "With Oracle Sales and Marketing Cloud Service and Xactly working together, customers gain insight and efficiencies. The combination can create more effective compensation programs, while motivating sales to work to its full potential."

    "The tremendous integration of leading ISVs with Oracle Sales and Marketing Cloud Service is a testament to the undeniable business value and demand from customers," said Anthony Lye, SVP of Oracle CRM. "Oracle Sales and Marketing Cloud Service continues to define the industry, and we are proud to work with these leading ISVs to help users simultaneously maximize sales and revenue and extend their current deployments for a deeper and single customer viewpoint."

    Supporting Resources: Oracle Sales and Marketing Cloud Service; Learn More About Oracle Cloud

    Read the article

  • First Day of Data Integration Track at Oracle OpenWorld 2012

    - by Irem Radzik
    OpenWorld started full speed for us today with a great set of sessions in the Data Integration track. After the exciting keynote session on Oracle Database 12c in the morning, Brad Adelberg, VP of Development for Data Integration products, presented Oracle's data integration product strategy. His session highlighted the new requirements for data integration to achieve pervasive and continuous access to trusted data. The new requirements and product focus areas presented in this session are:

    - Provide access to any data at any source, on premise or on cloud
    - Enable zero downtime operations and maximum performance
    - Leverage real-time data for accurate business insights
    - Ensure high-quality data is used across the enterprise

    During the session Brad walked through how Oracle's data integration products - Oracle Data Integrator, Oracle GoldenGate, Oracle Enterprise Data Quality, and Oracle Data Service Integrator - deliver on these requirements and how recent product releases build on this strategy. Soon after Brad's session we heard from a panel of Oracle GoldenGate customers - St. Jude Medical, Equifax, and Bank of America - how they achieved zero downtime operations using Oracle GoldenGate. The panel presented different use cases of GoldenGate, from active-active replication to offloading reporting. St. Jude Medical's implementation in particular, which involves the alert management system for patients who use their pacemakers, reminded me that in some cases downtime of mission-critical systems can be a matter of life or death. It is very comforting to hear that GoldenGate delivers highly reliable continuous availability for life-saving medical systems.

    In the afternoon, Nick Wagner from the Product Management team and I followed the customer panel with a review of Oracle GoldenGate 11gR2's new features. Many questions we received from the audience were about GoldenGate's new Integrated Capture for Oracle Database and the enhanced Conflict Management features, as well as how GoldenGate compares to Oracle Streams. In addition to giving details on GoldenGate's unique capability to capture changed data with a direct integration to the Oracle DBMS engine, we reminded the audience that enhancements to Oracle GoldenGate will continue, while Streams will be primarily maintained.

    Last but not least, Tim Garrod and Ryan Fonnett from Raymond James presented a unified real-time data integration solution using Oracle Data Integrator and GoldenGate for their operational data store (ODS). The ODS supports application services across the enterprise, and providing timely data is a critical requirement. In this solution, Oracle GoldenGate does the log-based change data capture for Oracle Data Integrator's near real-time data integration between heterogeneous systems. As Raymond James' ODS supports mission-critical services for their advisors, the project team had to set up this integration environment to be highly available. During the session, Ryan and Tim explained how they use ODI to enable automated process execution and "always-on" integration processes. Their presentation included two demonstrations that focused on CDC patterns deployed with ODI and the automated multi-instance execution and monitoring. We are very grateful to Tim and Ryan for their very well prepared presentation at OpenWorld this year.

    Day 2 (Tuesday) will also be a busy day in our track. In addition to the Fusion Middleware Innovation Awards ceremony at 11:45am at Moscone West 3001, we have the following DI sessions:

    - Real-World Operational Reporting Customer Panel, 11:45am, Moscone West 3005
    - Oracle Data Integrator Product Update and Future Strategy, 1:15pm, Moscone West 3005
    - High-Volume OLTP with Oracle GoldenGate: Best Practices from Comcast, 1:15pm, Moscone West 3005
    - Everything You Need to Know About Monitoring Oracle GoldenGate, 5pm, Moscone West 3005

    If you are at OpenWorld, please join us in these sessions. For a full review of the data integration track at OpenWorld, please see our Focus-On document.

    Read the article

  • Advantages of Terminal Server instead of a normal client-server installation?

    - by Sam
    What are the advantages of using a (Windows) Terminal Server and thin clients instead of a normal server and full clients? So far I've only really used normal servers and clients, but now customers are asking about Terminal Server, and I'd like to know the pros and cons of using it instead of an "old-fashioned" client-server network. Some things I can guess: easier administration (no need to install/update Office and other software on 20 computers, only on the server) and easier backup (no need to back up client computers). And I'd guess it would be hard (or impossible) to connect and use local hardware (like USB devices) with a Terminal Server? What other reasons are there for or against switching to Terminal Server?

    Read the article

  • Cumulative Update #7 for SQL Server 2008 Service Pack 3 is available

    - by AaronBertrand
    Today Microsoft has released a new cumulative update for SQL Server 2008: Cumulative Update #7 for SQL Server 2008 Service Pack 3.

    - Knowledge Base Article: KB #2738350
    - At the time of writing, there are 9 fixes listed
    - The build number is 10.00.5794
    - Relevant for @@VERSION between 10.00.5500 and 10.00.5793

    No word yet on an update for Service Pack 2. As usual, I'll post my standard disclaimer here: these updates are NOT for SQL Server 2008 R2 (where @@VERSION will report 10.50.xxxx)....(read more)
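
    Before applying a cumulative update, it is worth confirming which build a server is actually on. A quick sketch of how to check (SERVERPROPERTY is standard T-SQL):

        SELECT @@VERSION;                            -- full version string
        SELECT SERVERPROPERTY('ProductVersion');     -- e.g. 10.00.5500: this CU applies
        SELECT SERVERPROPERTY('ProductLevel');       -- e.g. SP3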

    Read the article

  • Execute an SSIS package in Sync or Async mode from SQL Server 2012

    - by Davide Mauri
    Today I had to schedule a package stored in the shiny new SSIS Catalog store that can be enabled in SQL Server 2012 (http://msdn.microsoft.com/en-us/library/hh479588(v=SQL.110).aspx). Once your packages are stored here, they will be executed using the new stored procedures created for this purpose. The script that gets executed if you run your packages right from Management Studio, or through a SQL Server Agent job, will be similar to the following:

        Declare @execution_id bigint

        EXEC [SSISDB].[catalog].[create_execution]
            @package_name = 'my_package.dtsx',
            @execution_id = @execution_id OUTPUT,
            @folder_name = N'BI',
            @project_name = N'DWH',
            @use32bitruntime = False,
            @reference_id = Null

        Select @execution_id

        DECLARE @var0 smallint = 1
        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'LOGGING_LEVEL',
            @parameter_value = @var0

        DECLARE @var1 bit = 0
        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'DUMP_ON_ERROR',
            @parameter_value = @var1

        EXEC [SSISDB].[catalog].[start_execution] @execution_id
        GO

    The problem here is that the procedure will simply start the execution of the package and return as soon as the package has been started, thus giving you the opportunity to execute packages asynchronously from your T-SQL code. This is just *great*, but what happens if I want to execute a package and WAIT for it to finish (and thus have a synchronous execution of it)? You have to be sure that you add the "SYNCHRONIZED" parameter to the package execution, before the start_execution procedure:

        exec [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'SYNCHRONIZED',
            @parameter_value = 1

    And that's it.

    PS: From RC0, the SYNCHRONIZED parameter is automatically added each time you schedule a package execution through the SQL Server Agent. If you're using an external scheduler, just keep this post in mind.
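
    Putting the pieces together, a synchronous execution looks like the following minimal sketch. It reuses the folder, project and package names from the example above; the final status check against catalog.executions is an addition of mine (status 7 = succeeded):

        Declare @execution_id bigint

        EXEC [SSISDB].[catalog].[create_execution]
            @package_name = 'my_package.dtsx',
            @execution_id = @execution_id OUTPUT,
            @folder_name = N'BI',
            @project_name = N'DWH',
            @use32bitruntime = False,
            @reference_id = Null

        -- Ask the catalog to block until the package completes
        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'SYNCHRONIZED',
            @parameter_value = 1

        EXEC [SSISDB].[catalog].[start_execution] @execution_id

        -- The package has finished by now; check how it ended
        SELECT status FROM [SSISDB].[catalog].[executions]
        WHERE execution_id = @execution_id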

    Read the article

  • SQL Server 2008 R2 Express Edition - a treat for small scale businesses

    - by ssqa.net
    SQL Server Express edition is the lightweight offering within the SQL Server arena; it is classed as a database platform that makes it easy to develop data-driven applications that are rich in capability, offer enhanced storage security, and are fast to deploy. Also, SQL Server 2008 Express with Advanced Services is an edition of the same flock that includes a new graphical management tool, features for reporting, and advanced text-based search capabilities. You can add the GUI capabilities for management...(read more)

    Read the article

  • Review the New Migration Guide to SQL Server 2012 AlwaysOn

    - by KKline
    I had the pleasure of meeting Mr. Cephas Lin, of Microsoft, last year at the SQL Saturday in Indianapolis and then later at the PASS Summit in the fall. Cephas has been writing content for SQL Server 2012 AlwaysOn and has recently published his first whitepaper, a migration guide to SQL Server AlwaysOn. Read it and then pass along any feedback: HERE Enjoy, -Kev - Follow me on Twitter!...(read more)

    Read the article

  • December 2012 Cumulative Updates are available for SQL Server 2008 R2

    - by AaronBertrand
    Microsoft released new cumulative updates for SQL Server.

    SQL Server 2008 R2 Service Pack 1 Cumulative Update #10

    - KB Article: KB #2783135
    - 16 fixes are listed at the time of publication
    - Build number is 10.50.2868
    - Relevant for @@VERSION 10.50.2500 through 10.50.2867

    SQL Server 2008 R2 Service Pack 2 Cumulative Update #4

    - KB Article: KB #2777358
    - 34 fixes are listed at the time of publication
    - Build number is 10.50.4270
    - Relevant for @@VERSION 10.50.4000 through 10.50.4269

    My usual disclaimer: these updates...(read more)

    Read the article

  • Microsoft Delivers Full Suite of SQL Server PowerShell Cmdlets

    - by merrillaldrich
    We’ve all been waiting several years for this, and finally it’s here! Coinciding (approximately) with the release of SQL Server 2012, a new Feature Pack has appeared on the Microsoft web site that adds a full suite of PowerShell cmdlets for DDL and other functions. This means that, at last, we can do things like fully-featured SQL deployment scripts without all the (severe) limitations of T-SQL, such as primitive use of variables, flow control, exception handling. Taking a cue, finally, from the...(read more)

    Read the article

  • Cumulative Update #5 is available for SQL Server 2012 RTM

    - by AaronBertrand
    Microsoft has released Cumulative Update #5 for SQL Server 2012 RTM. Note this is *not* a cumulative update for Service Pack 1, so if your build # is >= 11.0.3000, you should not be installing this update.

    - KB Article: KB #2777772
    - Build # 11.0.2395
    - 28 fixes at the time of writing
    - Relevant for builds 11.0.2100 -> 11.0.3329

    Do not attempt to install on SQL Server 2012 SP1 (any build >= 11.0.3000) or any previous version of SQL Server....(read more)

    Read the article

  • The most dangerous SQL Script in the world!

    - by DrJohn
    In my last blog entry, I outlined how to automate SQL Server database builds from concatenated SQL Scripts. However, I did not mention how I ensure the database is clean before I rebuild it. Clearly a simple DROP/CREATE DATABASE command would suffice; but you may not have permission to execute such commands, especially in a corporate environment controlled by a centralised DBA team. However, you should at least have database owner permissions on the development database so you can actually do your job! Then you can employ my universal "drop all" script which will clear down your database before you run your SQL Scripts to rebuild all the database objects.

    Why start with a clean database? During the development process, it is all too easy to leave old objects hanging around in the database which can have unforeseen consequences. For example, when you rename a table you may forget to delete the old table and change all the related views to use the new table. Clearly this will mean an end-user querying the views will get the wrong data and your reputation will take a nose dive as a result! Starting with a clean, empty database and then building all your database objects with SQL Scripts, using the technique outlined in my previous blog, means you know exactly what you have in your database. The database can then be repopulated using SSIS and bingo; you have a data mart "to go".

    My universal "drop all" SQL Script: To ensure you start with a clean database, run my universal "drop all" script, which you can download from here: 100_drop_all.zip. By using the database catalog views, the script finds and drops all of the following database objects:

    - Foreign key relationships
    - Stored procedures
    - Triggers
    - Database triggers
    - Views
    - Tables
    - Functions
    - Partition schemes
    - Partition functions
    - XML Schema Collections
    - Schemas
    - Types
    - Service broker services
    - Service broker queues
    - Service broker contracts
    - Service broker message types
    - SQLCLR assemblies

    There are two optional sections to the script: drop users and drop roles. You may use these at your peril, particularly as you may well remove your own permissions! Note that the script has a verbose mode which displays the SQL commands it is executing. This can be switched on by setting @debug=1. Running this script against one of the system databases is certainly not recommended! So I advise you to keep a USE database statement at the top of the file. Good luck and be careful!!
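
    The download itself isn't reproduced in this excerpt, but the catalog-view pattern the script relies on looks roughly like the fragment below: query the catalog views, generate DROP statements, then execute them. This is a hypothetical sketch covering just foreign keys and tables, not the actual 100_drop_all script:

        -- Hypothetical fragment illustrating the pattern, not DrJohn's script.
        DECLARE @sql nvarchar(max) = N'';

        -- Drop foreign keys first so the tables can be dropped afterwards
        SELECT @sql = @sql + N'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id))
                    + N'.' + QUOTENAME(t.name)
                    + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N'; '
        FROM sys.foreign_keys AS fk
        JOIN sys.tables AS t ON fk.parent_object_id = t.object_id;

        SELECT @sql = @sql + N'DROP TABLE ' + QUOTENAME(SCHEMA_NAME(schema_id))
                    + N'.' + QUOTENAME(name) + N'; '
        FROM sys.tables;

        -- PRINT @sql;   -- "verbose mode": inspect the commands before running them
        EXEC sys.sp_executesql @sql;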

    Read the article

  • T-SQL Tuesday #007 and T-SQL Tuesday Has a Logo

    - by Adam Machanic
    This month’s T-SQL Tuesday is hosted by Jorge Segarra, the “SQL Chicken.” The topic is rather open ended: What is your favorite new(ish) SQL Server feature? Love the DACPAC? Can’t wait for PDW? Post about it and tell us why! In other T-SQL Tuesday news, we now have a logo. Those of you who are participating in the event, take notice; the rules have changed. Now that we have a logo we’re simplifying the linkback and subject guidelines a bit. Henceforth you can title your post however you want. It...(read more)

    Read the article

  • 24 hours of PASS is back!

    - by Sergio Govoni
    The most important free online event on SQL Server and Business Intelligence is back! The 24 Hours of PASS is coming back with a great edition fully based on the new features of SQL Server 2014. What can you expect from the next PASS Summit? Find out on June 25, 2014 (12:00 GMT) at 24 Hours of PASS: SQL Server 2014! Register now at this link. No matter what part of the world you follow the event from, the important thing to know is that it will be 24 hours of continuous training on SQL Server and Business Intelligence.

    Read the article

  • Creating a SQL Azure Database Should be Easier

    - by Ken Cox [MVP]
    Every time I try to create a database + tables + data for Windows Azure SQL I get errors. One of them is "'Filegroup reference and partitioning scheme' is not supported in this version of SQL Server." It's partly due to my poor memory (since I've succeeded before) and partly due to the failure of tools that should be helping me. For example, when I want to create a script from an existing database on my local workstation, I use SQL Server Management Studio (currently v 11.0.2100.60). I go to Tasks > Generate Scripts, which brings up the nice Generate and Publish Scripts wizard. When I go into the Advanced button, under Script for Server Version, why don't I see SQL Azure as an option by now? The tool should be sorting this out for me, right? Maybe this is available in SQL Server Data Tools? I haven't got into that yet. Just merge the functionality with SSMS, please. Anyway, I pick an older version of SQL for the target and still need to tweak it for Azure. For example, I take out all the "[dbo]." stuff. Why is it put there by the wizard? I also have to get rid of "ON [PRIMARY]" to deal with the error I noted at the top. Yes, there's information on what a table needs to look like in SQL Azure, but the tools should know this so I don't have to mess with it.
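
    To illustrate the kind of tweaking involved, here is a hypothetical table scripted for on-premises SQL Server next to the same table trimmed for SQL Azure (the table itself is made up; note SQL Azure of that era also required a clustered index on every table):

        -- As the wizard scripts it for on-premises SQL Server:
        CREATE TABLE [dbo].[Customers] (
            CustomerID int NOT NULL,
            Name nvarchar(100) NOT NULL,
            CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
        ) ON [PRIMARY];

        -- Trimmed for SQL Azure: no filegroup reference
        CREATE TABLE Customers (
            CustomerID int NOT NULL,
            Name nvarchar(100) NOT NULL,
            CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
        );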

    Read the article

  • Faster Trip to Innovation with Simplified Data Integration: Sabre Holdings Case Study

    - by Tanu Sood
    Author: Irem Radzik, Director of Product Marketing, Data Integration, Oracle

    In today's fast-paced, competitive environment, IT teams are under pressure to deliver technology solutions for many critical business initiatives as fast as possible. When the focus is on speed, it can be easy to continue to use old-style, point-to-point custom scripts that grow organically to the point where they are unmanageable and too costly to maintain. As data volumes, data sources, and end users grow, uncoordinated data integration efforts create significant inefficiencies for both IT and business users. In addition to losing IT productivity due to maintaining spaghetti architecture, data integrity becomes a concern as well. Errors caused by inconsistent data and manual data entry can prove very costly for companies and disrupt business activities. Many industry leaders now recognize that data should be moved in an automated and reliable manner across all platforms to have one version of the truth. By simplifying their data integration architecture and standardizing on a centralized approach, IT teams can accelerate time to market. In particular, a centralized, shared-service approach brings agility, increases IT productivity, and frees up resources for innovation.

    One such industry leader that simplified its data integration architecture is Sabre Holdings. Sabre Holdings provides distribution and technology solutions for the travel industry, and is a winner of the Oracle Excellence Awards for Fusion Middleware in 2011 in the data integration category. I had the pleasure to host Sabre Holdings on a public webcast and discuss their data integration best practices for data warehousing. In this webcast Sabre's Amjad Saeed presented how the company reduced complexity by consolidating systems and standardizing development on Oracle Data Integrator and Oracle GoldenGate for its global data warehouse development team. With Oracle's complete real-time data integration solution, Sabre also streamlined support and maintenance operations, achieved a real-time view of the execution of the integration processes, and can manage the data warehouse and business intelligence solution performance on demand. By reducing complexity and leveraging timely market insights, the company was able to decrease time to market by 40%.

    You can now listen to the webcast on demand: Sabre Holdings Case Study: Accelerating Innovation using Oracle Data Integration. I invite you to hear directly from Sabre how to use advanced data integration capabilities to enable accelerated innovation. To learn more about Oracle's data integration offering you can download our free resources.

    Read the article

  • Application runs at different speeds for two different user logins on Windows Server

    - by karthi
    We have an application on a Windows Server that downloads data from SQL Server and stores it on our local machine. The problem I have is that my Windows Server has two logins; under one login the app runs really slowly (about 1 row per minute), while under the other it fetches about 20 rows per minute. We have only had this problem for the last couple of days. What could have caused this? More details:

    - SQL Server: SQL Server 2008
    - OS: Windows Server 2008
    - Access method: remote connection
    - Application: custom .NET application to get data
    - Both accounts are limited-rights accounts

    Any tools to track this? I am not sure what I should start with.
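
    One place to start, assuming you can query the server's DMVs, is to compare what each login's session is actually doing while the app runs. A sketch (the login names are hypothetical):

        -- Compare activity and waits for the two logins side by side
        SELECT s.login_name,
               s.status,
               s.cpu_time,
               s.memory_usage,
               r.wait_type,      -- what the slow session is waiting on, if anything
               r.wait_time
        FROM sys.dm_exec_sessions AS s
        LEFT JOIN sys.dm_exec_requests AS r
               ON r.session_id = s.session_id
        WHERE s.is_user_process = 1
          AND s.login_name IN (N'DOMAIN\LoginA', N'DOMAIN\LoginB');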

    Read the article

  • T-SQL Tuesday #34: HELP!

    - by merrillaldrich
    I owe my career to the SQL Server community, specifically the Internet SQL Server community, so this month’s T-SQL Tuesday is especially poignant. I changed careers “cold” about eight years ago, and, while I had some educational background in computer science, I had relatively little real-world DBA experience. Someone gave me a shot in the form of an entry level job, for which I am grateful, but I also had to make the argument to him that I would figure out whatever I needed to do to be successful...(read more)

    Read the article

  • If a SQL Server Replication Distributor and Subscriber are on the same server, should a PUSH or PULL subscription be used?

    - by userx
    Thanks in advance for any help. I'm setting up a new Microsoft SQL Server replication and I have the Distributor and Subscriber running on the same server. The Publisher is on a remote server (as it is a production database and MS recommends that for high volumes, the Distributor should be remote). I don't know much about the inner workings of PUSH vs PULL subscriptions, but my gut tells me that a PUSH subscription would be less resource intensive because (1) the Distributor is already remote, so this shouldn't negatively affect the Publisher, and (2) pushing the transactions from the Distributor to the Subscriber is more efficient than the Subscriber polling the Distribution database. Does anyone have any resources or insight into PUSH vs PULL which would recommend one over the other? Is there really going to be that big of a difference in performance / reliability / security?
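
    For reference, the subscription type is chosen when the subscription is created. A minimal sketch using the standard replication procedures (the publication, subscriber and database names are hypothetical):

        -- Run at the Publisher, in the publication database.
        -- 'push' means the Distribution Agent delivers changes to the Subscriber.
        EXEC sp_addsubscription
            @publication = N'MyPublication',
            @subscriber = N'SUBSCRIBERSERVER',
            @destination_db = N'SubscriberDB',
            @subscription_type = N'push';

        -- Create the Distribution Agent job for the push subscription
        EXEC sp_addpushsubscription_agent
            @publication = N'MyPublication',
            @subscriber = N'SUBSCRIBERSERVER',
            @subscriber_db = N'SubscriberDB',
            @job_login = NULL, @job_password = NULL;  -- run under the Agent service account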

    Read the article

  • T-SQL - Is there a (free) way to compare data in two tables?

    - by RPM1984
    Okay, so I have table a and table b (SQL Server 2008). Both tables have the exact same schema. For the purposes of this question, consider table a = my local dev table, table b = the live table. I need to create a SQL script (containing UPDATE/DELETE/INSERT statements) that will update table b to be the same as table a. This script will then be deployed to the live database. Are there any free tools out there that can do this, or better yet a way I can do it myself? I'm thinking I probably need to do some type of a join on all the fields in the tables, then generate dynamic SQL based on that. Anyone have any ideas?
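
    One do-it-yourself handhold, sketched against a made-up schema where both tables are keyed on id with a single data column val: EXCEPT shows which rows differ in each direction, and a MERGE can then apply the synchronization in one statement.

        -- Rows in a that are missing from, or different in, b
        SELECT id, val FROM dbo.a
        EXCEPT
        SELECT id, val FROM dbo.b;

        -- And the reverse direction
        SELECT id, val FROM dbo.b
        EXCEPT
        SELECT id, val FROM dbo.a;

        -- Bring b in line with a (hypothetical columns id, val)
        MERGE dbo.b AS t
        USING dbo.a AS s ON t.id = s.id
        WHEN MATCHED AND t.val <> s.val THEN UPDATE SET t.val = s.val
        WHEN NOT MATCHED BY TARGET THEN INSERT (id, val) VALUES (s.id, s.val)
        WHEN NOT MATCHED BY SOURCE THEN DELETE;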

    Read the article

  • Using both IIS6 and IIS7 with the same SQL State Server

    - by Josef
    We are trying to use new IIS7 (32-bit, Classic Mode) webs in addition to our IIS6 webs with one SQL State Server for ASP.NET session handling. Unfortunately, the number of transactions per second on the State Server spikes (10 times or more) as soon as we add the new IIS7 web to the farm. Are there any known issues with the described setup?

    Read the article

  • Problem installing SQL Server client tools

    - by Shiraz Bhaiji
    We are trying to install SQL Server 2005 Standard on Windows 2008 Standard, both 64-bit. We have done this before with no problems. This time we get an error during the installation of the client tools: "There was an unexpected failure during the setup wizard. Link Id 20476, message ID 50000." There are no errors in the event log. Anybody have any ideas what could be wrong?

    Read the article
