Search Results

Search found 63598 results on 2544 pages for 'sql add on'.

Page 423/2544 | < Previous Page | 419 420 421 422 423 424 425 426 427 428 429 430  | Next Page >

  • SSIS Virtual Class

    - by ejohnson2010
    I recorded a Virtual SSIS Class with the good folks over at SSWUG, and the first airing of the class will be May 15th. This is 100% online, so you can do it on your own time and from anywhere. The class will run monthly and I will be available for questions throughout. You get the following 12 sessions on SSIS, each about an hour: Session 1: The SSIS Basics; Session 2: Control Flow Basics; Session 3: Data Flow - Sources and Destinations; Session 4: Data Flow - Transformations; Session 5: Advanced Transformations...(read more)

    Read the article

  • Paging using Linq-To-Sql based on two parameters in asp.net mvc...

    - by Pandiya Chendur
    Given two parameters, say currentPage and pageSize: thus far I have used SQL Server stored procedures and implemented paging like this: GO ALTER PROCEDURE [dbo].[GetMaterialsInView] -- Add the parameters for the stored procedure here @CurrentPage INT, @PageSize INT AS BEGIN -- SET NOCOUNT ON added to prevent extra result sets from -- interfering with SELECT statements. SET NOCOUNT ON; SELECT *,ROW_NUMBER() OVER (ORDER BY Id) AS Row FROM ( SELECT *,ROW_NUMBER() OVER (ORDER BY Id) AS Row FROM InTimePagingView ) AS InTimePages WHERE Row >= (@CurrentPage - 1) * @PageSize + 1 AND Row <= @CurrentPage*@PageSize SELECT COUNT(*) as TotalCount FROM InTimePagingView SELECT CEILING(COUNT(*) / CAST(@PageSize AS FLOAT)) NumberOfPages FROM InTimePagingView END Now I am using LINQ to SQL, and I use this: public IQueryable<MaterialsObj> FindAllMaterials() { var materials = from m in db.Materials join Mt in db.MeasurementTypes on m.MeasurementTypeId equals Mt.Id where m.Is_Deleted == 0 select new MaterialsObj() { Id = Convert.ToInt64(m.Mat_id), Mat_Name = m.Mat_Name, Mes_Name = Mt.Name, }; return materials; } Now I want to return the records together with a TotalCount, which I use to generate page numbers. Is this possible? Any suggestions? EDIT: Just found this... NorthWindDataContext db = new NorthWindDataContext(); var query = from c in db.Customers select c.CompanyName; //Assuming Page Number = 2, Page Size = 10 int iPageNum = 2; int iPageSize = 10; var PagedData = query.Skip((iPageNum - 1) * iPageSize).Take(iPageSize); ObjectDumper.Write(PagedData);
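
    One way to do this directly in LINQ to SQL, sketched below on the assumption that it lives alongside the FindAllMaterials() method shown above, is to run Count() once for the total and then apply OrderBy/Skip/Take for the requested page; both calls are translated to SQL. The PagedResult container and the GetMaterialsPage name are illustrative, not part of the original code.

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Illustrative container for one page of results plus the paging totals.
    public class PagedResult<T>
    {
        public IList<T> Items { get; set; }
        public int TotalCount { get; set; }
        public int NumberOfPages { get; set; }
    }

    public partial class MaterialsRepository   // hypothetical home of FindAllMaterials()
    {
        public PagedResult<MaterialsObj> GetMaterialsPage(int currentPage, int pageSize)
        {
            IQueryable<MaterialsObj> query = FindAllMaterials();

            // Issued as a SELECT COUNT(*) over the same join/filter as the page query.
            int totalCount = query.Count();

            var items = query
                .OrderBy(m => m.Id)                       // Skip/Take require a stable ordering
                .Skip((currentPage - 1) * pageSize)
                .Take(pageSize)
                .ToList();

            return new PagedResult<MaterialsObj>
            {
                Items = items,
                TotalCount = totalCount,
                NumberOfPages = (int)Math.Ceiling(totalCount / (double)pageSize)
            };
        }
    }
    ```

    As with the stored procedure above, this costs one extra round trip for the count; page numbers can then be generated from NumberOfPages on the caller's side.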

    Read the article

  • A better way to search Connect

    - by AaronBertrand
    I recently spotted a comment from Microsoft on a Connect item with 13 total up-votes . The comment went something like, "wow, due to the explosive response to this issue, we're going to deal with it right away." Okay, it wasn't that emphatic, it was actually: "I've brought the MVP customer vote count to the attention of dev, and a new owner of this DMV says he will dig up some info for us." Still, knowing that I had seen other items with a much stronger response and barely a note of acknowledgment...(read more)

    Read the article

  • Are ternary operators not valid for linq-to-sql queries?

    - by KallDrexx
    I am trying to display a nullable date time in my JSON response. In my MVC Controller I am running the following query: var requests = (from r in _context.TestRequests where r.scheduled_time == null && r.TestRequestRuns.Count > 0 select new { id = r.id, name = r.name, start = DateAndTimeDisplayString(r.TestRequestRuns.First().start_dt), end = r.TestRequestRuns.First().end_dt.HasValue ? DateAndTimeDisplayString(r.TestRequestRuns.First().end_dt.Value) : string.Empty }); When I run requests.ToArray() I get the following exception: Could not translate expression ' Table(TestRequest) .Where(r => ((r.scheduled_time == null) AndAlso (r.TestRequestRuns.Count > 0))) .Select(r => new <>f__AnonymousType18`4(id = r.id, name = r.name, start = value(QAWebTools.Controllers.TestRequestsController). DateAndTimeDisplayString(r.TestRequestRuns.First().start_dt), end = IIF(r.TestRequestRuns.First().end_dt.HasValue, value(QAWebTools.Controllers.TestRequestsController). DateAndTimeDisplayString(r.TestRequestRuns.First().end_dt.Value), Invoke(value(System.Func`1[System.String])))))' into SQL and could not treat it as a local expression. If I comment out the end = line, everything seems to run correctly, so it doesn't seem to be the use of my local DateAndTimeDisplayString method, so the only thing I can think of is Linq to Sql doesn't like Ternary operators? I think I've used ternary operators before, but I can't remember if I did it in this code base or another code base (that uses EF4 instead of L2S). Is this true, or am I missing some other issue?
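
    For what it's worth, a conditional (ternary) operator on its own usually does translate to a SQL CASE expression; what LINQ to SQL cannot translate here is the call to the local DateAndTimeDisplayString method inside one branch of the projection. A common workaround, sketched below with the question's own names (treat it as an illustrative shape rather than the exact controller code), is to select only raw, translatable values on the SQL side and do the string formatting after switching to LINQ to Objects with AsEnumerable():

    ```csharp
    // Inside the same controller action as the original query.
    var requests = _context.TestRequests
        .Where(r => r.scheduled_time == null && r.TestRequestRuns.Count > 0)
        .Select(r => new
        {
            r.id,
            r.name,
            StartDt = r.TestRequestRuns.First().start_dt,
            EndDt = r.TestRequestRuns.First().end_dt      // nullable; left untouched in SQL
        })
        .AsEnumerable()                                   // everything below runs in memory
        .Select(r => new
        {
            id = r.id,
            name = r.name,
            start = DateAndTimeDisplayString(r.StartDt),
            end = r.EndDt.HasValue ? DateAndTimeDisplayString(r.EndDt.Value) : string.Empty
        })
        .ToArray();
    ```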

    Read the article

  • Chapter 7–Enforced Data Protection

    - by drsql
    As the book progresses, I find myself veering from the original stated outline quite a bit, because as I teach about this more (and I am teaching a daylong db design class in August at http://www.sqlsolstice.com/ … shameless plug, but it is on topic :) I start to find that a given order works better. Originally I had slated myself to talk more about modeling here for three chapters, then get back to the more implementation topics to finish out the book, but now I am going to keep plugging through...(read more)

    Read the article

  • Using Coalesce

    - by Derek Dieter
    The COALESCE function is used to find the first non-null value. The function takes a limitless number of parameters and evaluates them in order, returning the first one that is not null. If all the parameters are null, then COALESCE will also return a NULL value. -- hard coded example SELECT MyValue = COALESCE(NULL, NULL, 'abc', 123) The example above returns [...]

    Read the article

  • SQL Server 2012 Integration Services - Using PowerShell to Configure Project Environments

    Continuing our discussion on how to leverage the capabilities of PowerShell to automate the most basic SSIS management tasks, this article will explore more complex topics by demonstrating the use of PowerShell in implementing and utilizing project environments.

    Read the article

  • SQL Server Integration Services Connection Manager Tips and Tricks

    In this article, we will take a look at the following Tips and Tricks for Connection Managers: Adding an "Application Name" property to the connection string; Creating Two Connection Managers for each Database Connection; and Capturing Connection Manager details in Package Configurations.
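
    As a quick illustration of the first tip, here is one way (a sketch, not code from the article) to stamp an application name onto a connection string from C#, so the connection is easy to identify in sys.dm_exec_sessions, sp_who2 or a Profiler trace; the server, database and name values are hypothetical:

    ```csharp
    using System.Data.SqlClient;

    class ConnectionNaming
    {
        static void Main()
        {
            var builder = new SqlConnectionStringBuilder
            {
                DataSource = "MyServer",
                InitialCatalog = "MyDatabase",
                IntegratedSecurity = true,
                ApplicationName = "NightlyLoadPackage"   // surfaces as program_name on the server
            };

            using (var connection = new SqlConnection(builder.ConnectionString))
            {
                connection.Open();
                // ... run the package's queries; the session is now clearly labelled.
            }
        }
    }
    ```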

    Read the article

  • More Tables or More Databases?

    - by BuckWoody
    I got an e-mail from someone with an interesting situation. He has 15,000 customers, and he asks if he should have a database per customer for their data. Without a LOT more data it’s impossible to say, of course, but there are some general concepts to keep in mind. Whenever you’re segmenting data, it’s all about boundary choices. You have not only boundaries around how big the data will get, but things like how many objects (tables, stored procedures and so on) will be involved, whether there are any cross-sections of data (do they share location or product information?) and – very important – what the security requirements are. From the answers to these types of questions, you then have the choice of making multiple tables in a single database, or using multiple databases. A database carries some overhead – it needs a certain amount of memory for locking and so on. But it has a very clean boundary – everything from objects to security can be kept apart. Having multiple users in the same database is possible as well, using things like a schema. But keeping 15,000 schemas can be challenging as well. My recommendation in complex situations like this is similar to a post on decisions that I did earlier – I lay out the choices in rows on a spreadsheet, and my requirements at the top in the columns. I give each choice a number based on how well it meets each requirement. At the end, the highest number wins. And many times it’s a mix – perhaps this person could segment customers into larger regions, districts or products, each in a database. Within that database might be multiple schemas for the customers. Of course, if he needs to query across all customers, that becomes another requirement.

    Read the article

  • Between-request Garbage Collection using Passenger

    - by raphaelcm
    We're using Rails 3.0.7 and REE 1.8.7. Long-term, we will be upgrading, but at the moment it's not feasible. Following the advice of several blog posts, we've been tuning our GC, and have settings that work pretty well. But we would really like to run GC outside of the request-response cycle. I've tried patching Passenger per this post, and using the code supplied in this SO question. In both cases, GC does indeed happen between requests. However, every time the between-request GC happens, I see a bunch of this: MONGODB [INFO] Connecting... MONGODB admin['$cmd'].find({:ismaster=>1}).limit(-1) MONGODB admin['$cmd'].find({:ismaster=>1}).limit(-1) MONGODB admin['$cmd'].find({:ismaster=>1}).limit(-1) Starting the New Relic Agent. Installed New Relic Browser Monitoring middleware SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES RefinerySetting Load (0.0ms) SELECT `refinery_settings`.* FROM `refinery_settings` WHERE `refinery_settings`.`scoping` = 'pages' AND `refinery_settings`.`name` = 'use_marketable_urls' LIMIT 1 SQL (0.0ms) BEGIN RefinerySetting Load (0.0ms) SELECT `refinery_settings`.* FROM `refinery_settings` WHERE `refinery_settings`.`id` = 1 LIMIT 1 AREL (0.0ms) UPDATE `refinery_settings` SET `value` = '--- \"false\"\n', `callback_proc_as_string` = NULL WHERE `refinery_settings`.`id` = 1 SQL (0.0ms) SHOW TABLES RefinerySetting Load (0.0ms) SELECT `refinery_settings`.* FROM `refinery_settings` SQL (0.0ms) COMMIT SQL (0.0ms) SHOW TABLES RefinerySetting Load (4.0ms) SELECT `refinery_settings`.* FROM `refinery_settings` WHERE `refinery_settings`.`scoping` IS NULL AND `refinery_settings`.`name` = 'user_image_sizes' LIMIT 1 SQL (0.0ms) BEGIN RefinerySetting Load (0.0ms) SELECT `refinery_settings`.* FROM `refinery_settings` WHERE `refinery_settings`.`id` = 17 LIMIT 1 AREL (0.0ms) UPDATE `refinery_settings` SET `value` = '--- \n:small: 120x120>\n:medium: 280x280>\n:large: 580x580>\n', `callback_proc_as_string` = NULL WHERE `refinery_settings`.`id` = 17 SQL (0.0ms) SHOW TABLES RefinerySetting Load (0.0ms) SELECT `refinery_settings`.* FROM `refinery_settings` SQL (0.0ms) COMMIT ******** Engine Extend: app/helpers/blog_posts_helper SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES SQL (4.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES ******** Engine Extend: app/models/user SQL (0.0ms) describe `roles_users` SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES SQL (4.0ms) describe `roles_users` SQL (0.0ms) SHOW TABLES SQL (4.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES SQL (0.0ms) SHOW TABLES (etc, etc, etc) Which is what happens when rails "loads the world" when the app starts up. Basically, GC.start is re-loading the app for some reason. Because of this, between-request GC is much slower than inline GC. Is there a way around this? I would love to have snappy, between-request GC if possible. Thanks.

    Read the article

  • Outlook 2007 VSTO Add-in deployed by click-once doesn't detect published updates

    - by Matt
    I have created an outlook 2007 add-in project in vs2008, targeting .net 3.5, then migrated the project to vs2010. I have then published the project from vs2010 to a web site, and installed the add-in using click-once to a virtual machine running xp, .net 3.5 sp1, and outlook 2007. This all works great and I can see my add-in within outlook. Publish update settings are set to update the add-in at startup rather than every 7 days. However when I then make a simple change to the add-in, update the AssemblyVersion and AssemblyFileVersion of the add-in project, and then publish the updates, when I run outlook it doesn't detect that there is a new version, and just runs the current one that is installed. I can see that the publish has generated a new setup.exe and added a new folder to the 'Application Files' folder with the current (autogenerated) publish version. Can anyone suggest anything how I can get the update to be deployed to the client?

    Read the article

  • What strategy should be employed to access Facebook data offline?

    - by user686021
    I'm working on a project similar to Klout, which provides detail about how you influence other people and who influenced you. We'll be fetching data from a few social networking sites (i.e. LinkedIn, Facebook, Twitter, etc.) to analyze how users interact with one another. For that we need to parse the data, store it in a database, and analyze it so that the strength of the relationship between two users can be determined. We'll be accessing the data offline as well to provide accurate results. If we consider Facebook activity, we need access to Facebook users' news feed and wall data, which includes likes, comments, shares, etc. To decide how one user influences another, we'll store all the data and analyze it. I need suggestions on what steps to take for good performance. We'll be using ASP.NET (C#) Web Forms, SQL Server, and jQuery. The main concern is parsing the data, and its storage and retrieval with the least overhead. I've summarized a few points below: Should we switch over to a document-oriented database, like MongoDB or RavenDB, for the whole app or part of it, even though none of the team members have experience with them? Should we use SQL Server Analysis Services? Is there any other library than Json.NET for parsing data? Is it advisable to use a C# library over FQL + GET requests? I've tried to provide as much info as possible. Please share your views.
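
    On the parsing point specifically, Json.NET copes well with this kind of payload. A minimal sketch is below; the JSON shape is simplified and hypothetical rather than the actual Graph API contract, and real responses are larger and paginated:

    ```csharp
    using System;
    using Newtonsoft.Json.Linq;

    class FeedParser
    {
        static void Main()
        {
            const string json = @"{
                ""id"": ""1234567890_111"",
                ""from"": { ""id"": ""1001"", ""name"": ""Some User"" },
                ""message"": ""Hello world"",
                ""likes"": { ""data"": [ { ""id"": ""1002"" }, { ""id"": ""1003"" } ] }
            }";

            JObject entry = JObject.Parse(json);
            string postId = (string)entry["id"];
            string author = (string)entry["from"]["name"];
            int likeCount = ((JArray)entry["likes"]["data"]).Count;

            // These values would then be written to the relational (or document) store.
            Console.WriteLine("{0} by {1}: {2} likes", postId, author, likeCount);
        }
    }
    ```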

    Read the article

  • Loading city/state from SQL Server to Google Maps?

    - by knawlejj
    I'm trying to make a small application that takes a city & state and geocodes that address to a lat/long location. Right now I am utilizing Google Map's API, ColdFusion, and SQL Server. Basically the city and state fields are in a database table and I want to take those locations and get marker put on a Google Map showing where they are. This is my code to do the geocoding, and viewing the source of the page shows that it is correctly looping through my query and placing a location ("Omaha, NE") in the address field, but no marker, or map for that matter, is showing up on the page: function codeAddress() { <cfloop query="GetLocations"> var address = document.getElementById(<cfoutput>#Trim(hometown)#,#Trim(state)#</cfoutput>).value; if (geocoder) { geocoder.geocode( {<cfoutput>#Trim(hometown)#,#Trim(state)#</cfoutput>: address}, function(results, status) { if (status == google.maps.GeocoderStatus.OK) { var marker = new google.maps.Marker({ map: map, position: results[0].geometry.location, title: <cfoutput>#Trim(hometown)#,#Trim(state)#</cfoutput> }); } else { alert("Geocode was not successful for the following reason: " + status); } }); } </cfloop> } And here is the code to initialize the map: var geocoder; var map; function initialize() { geocoder = new google.maps.Geocoder(); var latlng = new google.maps.LatLng(42.4167,-90.4290); var myOptions = { zoom: 5, center: latlng, mapTypeId: google.maps.MapTypeId.ROADMAP } var marker = new google.maps.Marker({ position: latlng, map: map, title: "Test" }); map = new google.maps.Map(document.getElementById("map_canvas"), myOptions); } I do have a map working that uses lat/long that was hard coded into the database table, but I want to be able to just use the city/state and convert that to a lat/long. Any suggestions or direction? Storing the lat/long in the database is also possible, but I don't know how to do that within SQL.

    Read the article

  • Extension to add button "Report to Bugzilla"?

    - by Alois Mahdal
    We have an internal MediaWiki installation for internal documents (we don't use it in a completely wiki-like style; only maintainers should normally make changes) and an internal Bugzilla installation for internal issues, including issues with these internal documents on the MediaWiki site. Now only the icing on the cake is missing: an automatic button that would appear on each page and could open a Bugzilla page with some fields (basically, the page name) pre-filled with information about that page. What I imagine as the best solution would be a sibling to the ubiquitous "[edit]" button, probably sitting next to it, like in the mock-up in the original post.

    Read the article

  • cannot add svn addon (Subclipse)

    - by Ubuntuser
    Hi, I am trying to install the Subclipse plugin for the Eclipse IDE. I have installed it, but on restart of the IDE it throws up the following error: Failed to load JavaHL Library. These are the errors that were encountered: no libsvnjavahl-1 in java.library.path no svnjavahl-1 in java.library.path no svnjavahl in java.library.path java.library.path = /usr/lib/jvm/java-6-sun-1.6.0.24/jre/lib/i386/client:/usr/lib/jvm/java-6-sun-1.6.0.24/jre/lib/i386::/usr/java/packages/lib/i386:/lib:/usr/lib How do I get past this error?

    Read the article

  • In SQL Server, what is the most efficient way to compare records to other records for duplicates within...

    - by Glenn
    We have a SQL Server that gets daily imports of data files from clients. This data is interrelated and we are always scrubbing it and having to look for suspect duplicate records between these files. Finding and tagging suspect records can get pretty complicated. We use logic that requires some field values to be the same, allows some field values to differ, and allows a range to be specified for how different certain field values can be. The only way we've found to do it is by using a cursor-based process, and it places a heavy burden on the database. So I wanted to ask if there's a more efficient way to do this. I've heard it said that there's almost always a more efficient way to replace cursors with clever JOINs. But I have to admit I'm having a lot of trouble with this one. For a concrete example, suppose we have one table, an "orders" table, with the following 6 fields: order_id, customer_id, product_id, quantity, sale_date, price. We want to look through the records to find suspect duplicates using the following example criteria, which get increasingly harder: 1. Records that have the same product_id, sale_date, and quantity but different customer_id's should be marked as suspect duplicates for review. 2. Records that have the same customer_id, product_id, and quantity, and have sale_dates within five days of each other, should be marked as suspect duplicates for review. 3. Records that have the same customer_id and product_id, but quantities that differ by no more than 20 units, and sale dates within five days of each other, should be considered suspect. Is it possible to satisfy each one of these criteria with a single SQL query that uses JOINs? Is this the most efficient way to do this?
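
    To give a feel for the join-based alternative to the cursor, here is a sketch (not a drop-in solution) for criterion 2 against the example orders table: the table is self-joined and rows are compared pairwise, with the date tolerance expressed as a DATEDIFF. The connection string is hypothetical; the column names are the ones listed above.

    ```csharp
    using System;
    using System.Data.SqlClient;

    class SuspectDuplicateFinder
    {
        // Criterion 2: same customer_id, product_id and quantity, sale_dates within five days.
        const string Query = @"
            SELECT a.order_id AS order_a, b.order_id AS order_b
            FROM   orders AS a
            JOIN   orders AS b
                   ON  a.customer_id = b.customer_id
                   AND a.product_id  = b.product_id
                   AND a.quantity    = b.quantity
                   AND a.order_id    < b.order_id   -- report each pair only once
                   AND ABS(DATEDIFF(day, a.sale_date, b.sale_date)) <= 5;";

        static void Main()
        {
            using (var connection = new SqlConnection(
                "Data Source=.;Initial Catalog=Sales;Integrated Security=true"))
            using (var command = new SqlCommand(Query, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("Suspect pair: {0} / {1}",
                                          reader["order_a"], reader["order_b"]);
                    }
                }
            }
        }
    }
    ```

    Criterion 1 is a plain equality join on product_id, sale_date and quantity with differing customer_id values, and criterion 3 keeps the same shape but relaxes the quantity condition to something like ABS(a.quantity - b.quantity) <= 20.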

    Read the article

  • Sql Server 2005 Database Tables - Row Comparison Column By Column.

    - by Goober
    Scenario: I have TWO database tables of exactly the SAME STRUCTURE. The difference between these tables is that one contains data populated by one application and the other is populated by a different application. Each application is trying to produce the same result, but using two different methods of implementation. Proposed Idea: What I want to do is run both applications, each of which will produce roughly 35000 rows containing 10 columns - so all in all, 70000 rows of data. I then want to compare each row of data, COLUMN BY COLUMN, to check whether the values are the same or not. Current Thoughts: Since there is so much data to compare, I feel that the best way to do this would be to write an application, preferably in C# (but if necessary, T-SQL), to compare each row of data column by column and write out any failed comparisons to a text log file. Question: Could anybody suggest an efficient way to perform column-by-column row comparison for 70000 rows worth of data? I'm struggling for ideas on how to tackle this problem. Extra Detail: The two applications are both written in C# on .NET 3.5. The database is running on SQL Server 2005. Help greatly appreciated.
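
    One workable shape for the C# route is sketched below; it assumes (hypothetically) that the two tables are ResultsA and ResultsB and share a RowKey column that pairs the rows up, loads both into DataTables in the same order, and logs every column-level mismatch:

    ```csharp
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class TableComparer
    {
        static DataTable Load(string connectionString, string sql)
        {
            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(sql, connectionString))
            {
                adapter.Fill(table);
            }
            return table;
        }

        static void Main()
        {
            const string cs = "Data Source=.;Initial Catalog=MyDb;Integrated Security=true";
            DataTable a = Load(cs, "SELECT * FROM ResultsA ORDER BY RowKey");
            DataTable b = Load(cs, "SELECT * FROM ResultsB ORDER BY RowKey");

            using (var log = new StreamWriter("comparison-failures.log"))
            {
                if (a.Rows.Count != b.Rows.Count)
                    log.WriteLine("Row counts differ: {0} vs {1}", a.Rows.Count, b.Rows.Count);

                int rows = Math.Min(a.Rows.Count, b.Rows.Count);
                for (int r = 0; r < rows; r++)
                {
                    for (int c = 0; c < a.Columns.Count; c++)
                    {
                        object left = a.Rows[r][c];
                        object right = b.Rows[r][c];
                        if (!Equals(left, right))   // DBNull compares equal to DBNull here
                        {
                            log.WriteLine("Key {0}, column {1}: '{2}' vs '{3}'",
                                          a.Rows[r]["RowKey"], a.Columns[c].ColumnName,
                                          left, right);
                        }
                    }
                }
            }
        }
    }
    ```

    If only row-level differences are needed, T-SQL's EXCEPT (SELECT * FROM ResultsA EXCEPT SELECT * FROM ResultsB, and the reverse) is a much cheaper first pass, though it reports whole rows rather than the individual columns that differ.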

    Read the article

  • What is the fastest way to get a DataTable into SQL Server?

    - by John Gietzen
    I have a DataTable in memory that I need to dump straight into a SQL Server temp table. After the data has been inserted, I transform it a little bit, and then insert a subset of those records into a permanent table. The most time consuming part of this operation is getting the data into the temp table. Now, I have to use temp tables, because more than one copy of this app is running at once, and I need a layer of isolation until the actual insert into the permanent table happens. What is the fastest way to do a bulk insert from a C# DataTable into a SQL Temp Table? I can't use any 3rd party tools for this, since I am transforming the data in memory. My current method is to create a parameterized SqlCommand: INSERT INTO #table (col1, col2, ... col200) VALUES (@col1, @col2, ... @col200) and then for each row, clear and set the parameters and execute. There has to be a more efficient way. I'm able to read and write the records on disk in a matter of seconds...
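
    One option worth benchmarking here (a sketch, not necessarily what the poster ended up using) is SqlBulkCopy, which streams the DataTable to the server in bulk rather than executing one parameterized INSERT per row. The only wrinkle with a local temp table is that the bulk copy has to run on the same open connection that created #table, since #table is only visible to its own session. The column list is shortened to three columns for illustration:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    class TempTableBulkLoad
    {
        static void BulkLoad(SqlConnection connection, DataTable data)
        {
            // The real table would declare col1 ... col200 to match the DataTable.
            using (var create = new SqlCommand(
                "CREATE TABLE #table (col1 int, col2 nvarchar(50), col3 datetime)", connection))
            {
                create.ExecuteNonQuery();
            }

            using (var bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "#table";
                bulk.BatchSize = 5000;        // tune for the workload
                bulk.BulkCopyTimeout = 0;     // no timeout for large loads
                bulk.WriteToServer(data);     // columns map by ordinal unless mappings are added
            }

            // Transform and INSERT INTO the permanent table here, on the same connection,
            // before the connection closes and the temp table disappears.
        }
    }
    ```

    If the DataTable's column order does not line up with the temp table, explicit entries in bulk.ColumnMappings can be added before calling WriteToServer.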

    Read the article

  • Can you modify SQL DB schema in a transaction to know if all changes were applied?

    - by Chris F
    As part of my (new) database version control methodology, I'm writing a "change script" and want the change script to insert a new row into the SchemaChangeLog table if the script is executed successfully, or to reverse the changes if any single change in the script fails. Is it possible to do schema changes in a transaction and only do the INSERT if the transaction gets committed? For example (pseudo-code, I'm not too good with SQL): SET XACT_ABORT ON BEGIN TRANSACTION PRINT 'Add Col2 to Table1' IF NOT EXISTS (SELECT * FROM sys.columns WHERE NAME='Col2' AND object_id=OBJECT_ID('Table1')) BEGIN ALTER TABLE [dbo].[Table1] ADD Col2 int NULL END -- maybe COMMIT here? INSERT INTO SchemaChangeLog VALUES(...) COMMIT TRANSACTION
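
    For reference, SQL Server treats most DDL (including ALTER TABLE ... ADD) as transactional, so the schema change and the SchemaChangeLog INSERT can share a single transaction and either both persist or both roll back; the same applies whether the script runs as plain T-SQL with BEGIN TRAN/COMMIT or is driven from code. A minimal ADO.NET sketch is below; the connection string and the SchemaChangeLog column names are hypothetical:

    ```csharp
    using System;
    using System.Data.SqlClient;

    class SchemaChangeScript
    {
        static void Main()
        {
            using (var connection = new SqlConnection(
                "Data Source=.;Initial Catalog=MyDb;Integrated Security=true"))
            {
                connection.Open();
                using (SqlTransaction tx = connection.BeginTransaction())
                {
                    try
                    {
                        Exec(connection, tx, "ALTER TABLE dbo.Table1 ADD Col2 int NULL");
                        Exec(connection, tx,
                             "INSERT INTO SchemaChangeLog (AppliedOn, Description) " +
                             "VALUES (GETDATE(), 'Add Col2 to Table1')");
                        tx.Commit();      // the DDL and the log row persist together
                    }
                    catch
                    {
                        tx.Rollback();    // the ALTER TABLE is undone along with the INSERT
                        throw;
                    }
                }
            }
        }

        static void Exec(SqlConnection connection, SqlTransaction tx, string sql)
        {
            using (var cmd = new SqlCommand(sql, connection, tx))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
    ```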

    Read the article

  • A few problems with Delphi involving Mail Merge, SQL + Databases.

    - by Daniel
    My first problem is with mail merge. I have created a Data File and a table, yet I am not able to fill my table with information from my Data File. The << just seems to be inserted after wherever the cursor is on the page, which is not where the table is. All that is entered into the actual table is a '59'. Therefore I think I either need to change the code or be able to move the cursor. Here is the code I am currently using: wrdDoc.Tables.Add(wrdSelection.Range, ADOTable1.FieldCount, 3); wrdDoc.Tables.Item(1).Columns.Item(1).SetWidth(51,wdAdjustNone); wrdDoc.Tables.Item(1).Columns.Item(2).SetWidth(20,wdAdjustNone); wrdDoc.Tables.Item(1).Columns.Item(3).SetWidth(100,wdAdjustNone); // Set the shading on the first row to light gray wrdDoc.Tables.Item(1).Rows.Item(1).Cells .Shading.BackgroundPatternColorIndex := wdGray25; // BOLD the first row wrdDoc.Tables.Item(1).Rows.Item(1).Range.Bold := True; // Center the text in Cell (1,1) wrdDoc.Tables.Item(1).Cell(1,1).Range.Paragraphs.Alignment := wdAlignParagraphCenter; // Fill each row of the table with data wrdDoc.Tables.Item(1).Cell(1, 1).Range.InsertAfter('Time'); wrdDoc.Tables.Item(1).Cell(1, 2).Range.InsertAfter(''); wrdDoc.Tables.Item(1).Cell(1, 3).Range.InsertAfter('Teacher'); For Count := 1 to (ADOTable1.FieldCount - 1) do begin wrdDoc.Tables.Item(1).Cell((Count + 1), 1).Range.InsertAfter(wrdSelection.Range,'Time' + IntToStr(Count)); wrdDoc.Tables.Item(1).Cell((Count + 1), 2).Range.InsertAfter(wrdSelection.Range,'THonorific' + IntToStr(Count)); wrdDoc.Tables.Item(1).Cell((Count + 1), 3).Range.InsertAfter(wrdSelection.Range,'TSurname' + IntToStr(Count)); end; My second problem is that I do not know the correct SQL syntax for editing the name of a column in the database (I am using Delphi 7 and the Microsoft Jet Engine, if that makes a difference). The third problem is that when I add a new column to my database manually (which I need to do), I get a 'violation' error in one of my units when I activate an ADOTable. This only happens in one unit, and it happens when I add a column with any name anywhere in the table. I know that is vague, but I can't seem to narrow down the problem any further than that. If you could help me with any of these, it would be great. Thanks.

    Read the article

  • Shared Datasets in SQL Server 2008 R2

    This article leverages the examples and concepts explained in Parts I through IV of the spatial data series, which develops a "BI-Satellite" app. Overview: In the spatial data series we ... [Read Full Article]

    Read the article

  • How to Add a Google Call Widget to Any Web Page

    - by babblescribe
    Adding a Google Call Widget to your website or blog allows visitors to contact you using your Google Voice number. The widget provides an easy and cost-effective way to provide live customer support without the customer knowing your real number. The Call Widget works by using Google Voice to first call the number the customer types into the widget form. Once connected, the user is prompted to connect to the number you have configured the widget to call. Google Voice connects the two numbers and you are talking away in an instant.

    Read the article

< Previous Page | 419 420 421 422 423 424 425 426 427 428 429 430  | Next Page >