Search Results

Search found 16890 results on 676 pages for '2008 archive'.

Page 272/676

  • Is SSIS able to query flat files from another Windows Server?

    - by atricapilla
    I'm a fairly new SQL Server Integration Services (SSIS) user. Can SSIS query data from text files located on another Windows Server? That is, when SSIS is installed on Windows Server A, can it read data from, say, a folder of text files on Windows Server B (in the same domain)? I have only used the SAP BO Data Integrator ETL tool, and it cannot query flat files from another server: during execution, all files must be located on the Job Server machine that executes the job.
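
    In general, SSIS treats a UNC path like a local path, so a Flat File connection pointing at a share on Server B should work as long as the account the package runs under can read that share. A minimal C# sketch of the same idea, with a made-up server and share name:

        using System;
        using System.IO;

        class UncReadSketch
        {
            static void Main()
            {
                // Hypothetical UNC path on Windows Server B; the account running the
                // package (SQL Agent job account or interactive user) needs read access.
                string path = @"\\ServerB\DataShare\input.txt";

                foreach (string line in File.ReadAllLines(path))
                {
                    Console.WriteLine(line);
                }
            }
        }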

    Read the article

  • How to connect to SQL Server cubes using .NET (C#)

    - by prince23
    Hi. I am new to the concept of cubes in SQL Server. I need to connect to a cube, run a query, get a result, and display that result in a GridView. Any help would be great: how to connect to a cube, articles on the subject, or any code that can help me achieve this. Thank you.
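
    A common route is ADOMD.NET (the Microsoft.AnalysisServices.AdomdClient library): open an AdomdConnection to the Analysis Services instance, run an MDX query, and bind the result to the grid. A rough sketch; the connection string, cube name, and MDX below are placeholders:

        using System.Data;
        using Microsoft.AnalysisServices.AdomdClient;

        public static DataTable QueryCube()
        {
            // Placeholder server/catalog and MDX; point these at your own SSAS instance and cube.
            string connStr = "Data Source=localhost;Catalog=Adventure Works DW";
            string mdx = "SELECT [Measures].MEMBERS ON COLUMNS FROM [Adventure Works]";

            using (AdomdConnection conn = new AdomdConnection(connStr))
            {
                conn.Open();
                AdomdDataAdapter adapter = new AdomdDataAdapter(mdx, conn);
                DataTable table = new DataTable();
                adapter.Fill(table);
                return table;
            }
        }

    The returned DataTable can then be assigned to GridView1.DataSource followed by GridView1.DataBind().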

    Read the article

  • Upgrade from .NET 2.0 to .NET 3.5 problems

    - by Bashir Magomedov
    I'm trying to upgrade our solution from VS2005/.NET 2.0 to VS2008/.NET 3.5. I converted the solution using the VS2008 conversion wizard. All the projects (about 50) remained targeted at .NET Framework 2.0; moreover, if I change the target framework manually for one of the projects, all referenced DLLs (i.e. System, System.Core, System.Data, etc.) still point to Framework 2.0. The only way I have found to completely change the target framework is to remove these references and add them again using the proper framework version. Doing it manually is not the best choice, I think: 50 projects, roughly 10 references each, at about half a minute per reference, is about 5 hours of work. Am I missing something? Are there any other ways of converting a full solution from .NET 2.0 to .NET 3.5? Thank you.
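
    One workaround, sketched here rather than prescribed, is to retarget the projects by editing the .csproj files directly: the target framework lives in a TargetFrameworkVersion element, so a small throwaway program can rewrite all 50 project files in one pass, and Visual Studio re-resolves the framework references when the solution is reloaded. The solution path below is a placeholder:

        using System.IO;
        using System.Text.RegularExpressions;

        class RetargetProjects
        {
            static void Main()
            {
                string root = @"C:\source\MySolution";   // hypothetical solution folder

                foreach (string proj in Directory.GetFiles(root, "*.csproj", SearchOption.AllDirectories))
                {
                    string text = File.ReadAllText(proj);
                    string updated = Regex.Replace(text,
                        "<TargetFrameworkVersion>[^<]*</TargetFrameworkVersion>",
                        "<TargetFrameworkVersion>v3.5</TargetFrameworkVersion>");

                    if (updated != text)
                        File.WriteAllText(proj, updated);
                    // Projects that do not contain the element at all still need a one-time
                    // change in the project properties dialog.
                }
            }
        }

    Back up (or commit) the solution first; this is a blunt text replacement, not a supported migration tool.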

    Read the article

  • filling in the holes in the result of a query

    - by ????? ????????
    My query is returning:

        +------+------+------+------+------+------+------+-------+------+------+------+------+-----+
        | Jan  | Feb  | Mar  | Apr  | May  | Jun  | Jul  | Aug   | Sep  | Oct  | Nov  | Dec  | Bla |
        +------+------+------+------+------+------+------+-------+------+------+------+------+-----+
        |    0 |    0 |    0 |    0 |    0 |    0 |    0 |     0 |    0 |    0 |    2 |    0 |  13 |
        |    1 |    0 |    0 |    0 |    0 |    0 |    0 |     0 |    0 |    2 |    0 |    0 |  14 |
        |    0 |    0 |    0 |    0 |    0 |    9 |    0 |     0 |    0 |    0 |    8 |   37 |  29 |
        |    0 |    0 |    0 |    0 |    0 |    0 |    0 |     0 |    0 |    1 |    0 |  374 |  30 |
        |    0 |    0 |    1 |    0 |   78 |    2 |    4 |     8 |   57 |  169 |  116 |  602 |  31 |
        |  156 |  255 |   79 |   75 |  684 |  325 |  289 |   194 |  407 |  171 |  584 |  443 |  32 |
        | 1561 | 2852 | 2056 |  796 | 2004 | 1755 |  879 |  1052 | 1490 | 1683 | 2532 | 2381 |  33 |
        | 4167 | 3841 | 4798 | 3399 | 4132 | 5849 | 3157 |  4381 | 4424 | 4487 | 4178 | 5343 |  34 |
        | 5472 | 5939 | 5768 | 4150 | 7483 | 6836 | 6346 |  6288 | 6850 | 7155 | 5706 | 5231 |  35 |
        | 5749 | 4741 | 5264 | 4045 | 6544 | 7405 | 7524 |  6625 | 6344 | 5508 | 6513 | 3854 |  36 |
        | 5464 | 6323 | 7074 | 4861 | 7244 | 6768 | 6632 |  7389 | 8077 | 8745 | 6738 | 5039 |  37 |
        | 5731 | 7205 | 7476 | 5734 | 9103 | 9244 | 7339 |  8970 | 9726 | 9089 | 6328 | 5512 |  38 |
        | 7262 | 6149 | 8231 | 6654 | 9886 | 9834 | 9306 | 10065 | 9983 | 9984 | 6738 | 5806 |  39 |
        | 5886 | 6934 | 7137 | 6978 | 9034 | 9155 | 7389 |  9437 | 9711 | 8665 | 6593 | 5337 |  40 |
        +------+------+------+------+------+------+------+-------+------+------+------+------+-----+

    As you can see, the Bla column starts at 13; I want it to start at 1, then 2, then 3, and so on. I do not want any gaps in the data. The reason there are gaps is that all of the months are 0 for those particular Bla values. How do I get the result set to include ALL values of Bla, even ones for which every month will be 0?
    Here are the desired results:

        +-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+
        | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec | Bla |
        +-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   1 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   2 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   3 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   4 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   5 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   6 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   7 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   8 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   9 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  10 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  11 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  12 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  13 |
        |   1 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   2 |   0 |   0 |  14 |
        |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  15 |
        |  …  |  …  |  …  |  …  |  …  |  …  |  …  |  …  |  …  |  …  |  …  |  …  |  …  |
        +-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+

    Here's my query:

        WITH CTE AS (
            select
                sum(case when datepart(month,[datetime entered]) = 1  then 1 end) as Jan,
                sum(case when datepart(month,[datetime entered]) = 2  then 1 end) as Feb,
                sum(case when datepart(month,[datetime entered]) = 3  then 1 end) as Mar,
                sum(case when datepart(month,[datetime entered]) = 4  then 1 end) as Apr,
                sum(case when datepart(month,[datetime entered]) = 5  then 1 end) as May,
                sum(case when datepart(month,[datetime entered]) = 6  then 1 end) as Jun,
                sum(case when datepart(month,[datetime entered]) = 7  then 1 end) as Jul,
                sum(case when datepart(month,[datetime entered]) = 8  then 1 end) as Aug,
                sum(case when datepart(month,[datetime entered]) = 9  then 1 end) as Sep,
                sum(case when datepart(month,[datetime entered]) = 10 then 1 end) as Oct,
                sum(case when datepart(month,[datetime entered]) = 11 then 1 end) as Nov,
                sum(case when datepart(month,[datetime entered]) = 12 then 1 end) as Dec,
                DATEPART(yyyy,[datetime entered]) as [Year],
                bla = CASE
                          WHEN datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE))*24
                               + CONVERT(CHAR(2),[datetime completed],108) > 191
                          THEN 192
                          ELSE datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE))*24
                               + CONVERT(CHAR(2),[datetime completed],108)
                      END
                --,datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE)) AS Sort_Days,
                --DATEPART(hour, [datetime completed]) AS Sort_Hours
            from [TurnAround]
            group by
                datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE))*24
                    + CONVERT(CHAR(2),[datetime completed],108),
                DATEPART(yyyy,[datetime entered]),
                [datetime entered]
                --[DateTime Completed]
        )
        SELECT
            ISNULL(SUM(Jan),0) Jan,
            ISNULL(SUM(Feb),0) Feb,
            ISNULL(SUM(Mar),0) Mar,
            ISNULL(SUM(Apr),0) Apr,
            ISNULL(SUM(May),0) May,
            ISNULL(SUM(Jun),0) Jun,
            ISNULL(SUM(Jul),0) Jul,
            ISNULL(SUM(Aug),0) Aug,
            ISNULL(SUM(Sep),0) Sep,
            ISNULL(SUM(Oct),0) Oct,
            ISNULL(SUM(Nov),0) Nov,
            ISNULL(SUM(Dec),0) Dec,
            [year],
            --Sort_Hours,
            --Sort_Days,
            A.RN Bla
        FROM (SELECT *, RN = ROW_NUMBER() OVER (ORDER BY object_id) FROM sys.all_objects) A
        LEFT JOIN CTE B
            ON A.RN = CASE WHEN B.Bla > 191 THEN 192 ELSE B.Bla END
        WHERE A.RN BETWEEN 1 AND 192
        GROUP BY A.RN, [year]

    Read the article

  • add ANOTHER primary key to a table which is UNIQUE

    - by gdubs
    So I'm having problems with adding another key to my table. I have 3 columns:

    1. AccountID (identity)
    2. EmailID
    3. Data field

    When I made the table, I used PRIMARY KEY (AccountID, EmailID) to make AccountID and EmailID unique. I thought that would make my EmailID unique, but after I tried inserting another row with the same EmailID it went through, so I think I missed something. Now for my questions: IF I have to use ALTER, how do I alter the table / PK constraint to modify the EmailID field and make it unique? And IF I decide to drop the table and make a new one, how do I make those two keys unique? Thanks a bunch!!
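
    For what it's worth, a composite primary key only enforces uniqueness of the (AccountID, EmailID) pair, not of EmailID on its own; the usual fix is a separate UNIQUE constraint on EmailID. A sketch of that ALTER TABLE, executed here through ADO.NET to match the other examples on this page; the connection string, table name, and constraint name are all placeholders:

        using System.Data.SqlClient;

        class AddUniqueConstraint
        {
            static void Main()
            {
                const string connStr = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder
                const string ddl =
                    "ALTER TABLE dbo.Accounts " +                       // placeholder table name
                    "ADD CONSTRAINT UQ_Accounts_EmailID UNIQUE (EmailID);";

                using (SqlConnection conn = new SqlConnection(connStr))
                using (SqlCommand cmd = new SqlCommand(ddl, conn))
                {
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }

    The same statement can of course be run directly in Management Studio; for a brand-new table, the equivalent is a UNIQUE constraint declared alongside the primary key in CREATE TABLE.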

    Read the article

  • What are the Pros & Cons of using SQL Azure for existing apps on dedicated servers

    - by Mark Redman
    We currently own our own servers and rent a rack in a datacentre. Looking at the pricing, scalability and SLAs for SQL Azure, I am thinking it might be viable to use SQL Azure for the database only, while continuing to run our existing applications on our own servers in the datacentre. This would let us stop worrying about the database and its infrastructure so we can concentrate on building an application server farm with disk storage for files etc. Our application is quite big, has various Windows services, and parts of it use unmanaged libraries that may not be feasible in the cloud, so we probably couldn't have everything in the Azure cloud. The pros: reduced total cost of ownership (no database servers, no SQL Server licenses). The cons: I guess there would be overhead in the transfer of data between the Azure cloud and our datacentre (i.e. the cloud may be in the US while the datacentre is in the UK), but would that overhead be workable?

    Read the article

  • Simple ASP.NET Database Query

    - by Steven
    I have loaded my database with one table under "Data Connections" in the "Server Explorer" pane. What is the standard / best-practices way to handle a simple query in a VB ASPX page? My left <div> would be a set of form elements to filter rows, and when the button is clicked, the main <div> would show the columns I want for the rows returned. Note: Answers in C# are okay too, I'll just translate.
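
    One conventional approach, sketched below with made-up control, table, and connection-string names, is a parameterized query in the filter button's click handler (in the page's code-behind), filled into a DataTable and bound to a GridView sitting in the main <div>:

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Web.Configuration;

        protected void FilterButton_Click(object sender, EventArgs e)
        {
            // "MyDb", Customers, and the control names are placeholders.
            string connStr = WebConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;

            using (SqlConnection conn = new SqlConnection(connStr))
            using (SqlCommand cmd = new SqlCommand(
                "SELECT Name, City, CreatedOn FROM Customers WHERE City = @city", conn))
            {
                cmd.Parameters.AddWithValue("@city", CityTextBox.Text);

                DataTable results = new DataTable();
                new SqlDataAdapter(cmd).Fill(results);   // Fill opens and closes the connection itself

                ResultsGridView.DataSource = results;
                ResultsGridView.DataBind();
            }
        }

    A declarative SqlDataSource plus GridView gives a similar result with less code, but keeping the query in code-behind tends to be easier to parameterize and test.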

    Read the article

  • SQL Server Table Partitioning, what is happening behind the scenes?

    - by user404463
    I'm working with table partitioning on an extremely large fact table in a warehouse. I have executed the script a few different ways, with and without non-clustered indexes. With the indexes it appears to dramatically expand the log file, while without the non-clustered indexes it appears not to expand the log file as much, but it takes more time to run due to the rebuilding of the indexes. What I am looking for is any links or information as to what is happening behind the scenes, specifically to the log file, when you split a table partition.

    Read the article

  • Any way to separate unit tests from integration tests in VS2008?

    - by AngryHacker
    I have a project full of tests, unit and integration alike. Integration tests require that a pretty large database be present, so it's difficult to make it a part of the build process simply because of the time that it takes to re-initialize the database. Is there a way to somehow separate unit tests from integration tests and have the build server just run the unit tests? I see that there is an Ordered Unit test in VS2008, which allows you to pick and choose tests, but I can't make it just execute alone, without all the others. Is there a trick that I am missing? Or perhaps I could adorn the unit tests with an attribute? What are some of the approaches people are using? P.S. I know I could use mocking for integration tests (just to make them go faster) but then it wouldn't be a true integration test.
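
    The MSTest that ships with VS2008 has no test categories (those came later), so one hedged workaround is to keep the integration tests in a separate test project that the build server simply doesn't run, or to gate them on an environment variable so they come back as inconclusive when the big database isn't available. The variable name below is invented for the example:

        using System;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        public abstract class IntegrationTestBase
        {
            [TestInitialize]
            public void SkipUnlessIntegrationEnvironment()
            {
                // RUN_INTEGRATION_TESTS is a made-up variable, set only on machines
                // that have the large test database restored.
                if (Environment.GetEnvironmentVariable("RUN_INTEGRATION_TESTS") == null)
                    Assert.Inconclusive("Integration environment not available; test skipped.");
            }
        }

        [TestClass]
        public class CustomerRepositoryIntegrationTests : IntegrationTestBase
        {
            [TestMethod]
            public void SavesAndReloadsCustomer()
            {
                // ...exercise the real database here...
            }
        }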

    Read the article

  • msvsmon is locking up my pdbs

    - by Sam Saffron
    During development of my media center plugin (which has a few custom build steps to GAC stuff and such), msvsmon has a rather annoying behaviour. The first compilation usually goes well, but subsequent compilations complain about myplugin.pdb being locked: Error 1 Unexpected error creating debug information file 'C:\Users\sam\source\myfile.PDB' -- 'C:\Users\sam\source\obj\Debug\myfile.pdb: The process cannot access the file because it is being used by another process. If I exit VS and nuke the object directory, I am able to compile again. Also, if I kill off msvsmon.exe I am able to compile again (but cannot debug). Has anyone seen this error? Are there any workarounds? I already disabled live semantic errors, just in case.

    Read the article

  • Visual Studio Unit Tests : dll is not trusted

    - by Ian
    I'm struggling to get some unit tests running and wondering if anyone might have anything insightful. The setup is that we've got a bunch of referenced DLLs on a server, and when I try to execute I get the old Test Run deployment issue: The location of the file or directory 'c:\source\ProjectName\bin\debug\3rdPartyLibrary.dll' is not trusted. I've tried the old caspol command: caspol -m -ag 1.2 -url file:\server\binaries* FullTrust which seems to work for everything bar one DLL. I'm currently having to manually change the permissions every time I do a build of the test project, which is a pain. Anyone have any suggestions? Running a Win7 64-bit OS, btw.

    Read the article

  • How to use Profile in asp.net?

    - by Phsika
    I am trying to learn ASP.NET profile management. I added the XML below (FirstName, LastName and others), but I cannot use Profile: if I try to use the Profile property, my editor shows

        Error 1  The name 'Profile' does not exist in the current context  C:\Documents and Settings\ykaratoprak\Desktop\Security\WebApp_profile\WebApp_profile\Default.aspx.cs  18  13  WebApp_profile

    How can I do that?

        <authentication mode="Windows"/>
        <profile>
          <properties>
            <add name="FirstName"/>
            <add name="LastName"/>
            <add name="Age"/>
            <add name="City"/>
          </properties>
        </profile>

        protected void Button1_Click(object sender, System.EventArgs e)
        {
            Profile.FirstName = TextBox1.Text;
            Profile.LastName = TextBox2.Text;
            Profile.Age = TextBox3.Text;
            Profile.City = TextBox4.Text;
            Label1.Text = "Profile stored successfully!<br />" +
                "<br />First Name: " + Profile.FirstName +
                "<br />Last Name: " + Profile.LastName +
                "<br />Age: " + Profile.Age +
                "<br />City: " + Profile.City;
        }
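
    The project name (WebApp_profile) suggests a Web Application Project, and that project type does not generate the strongly-typed Profile class that Web Site projects get, which is one common cause of exactly this compiler error. A hedged workaround is to go through HttpContext.Current.Profile (a ProfileBase) and use GetPropertyValue / SetPropertyValue instead of the generated properties:

        using System;
        using System.Web;
        using System.Web.Profile;

        protected void Button1_Click(object sender, EventArgs e)
        {
            ProfileBase profile = HttpContext.Current.Profile;

            profile.SetPropertyValue("FirstName", TextBox1.Text);
            profile.SetPropertyValue("LastName", TextBox2.Text);
            profile.SetPropertyValue("Age", TextBox3.Text);
            profile.SetPropertyValue("City", TextBox4.Text);
            profile.Save();

            Label1.Text = "First Name: " + (string)profile.GetPropertyValue("FirstName");
        }

    Note that profile values are only persisted for an identified user, so with <authentication mode="Windows"/> the request must be authenticated (or anonymous identification must be enabled for anonymous profiles).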

    Read the article

  • MDX query for SSRS2008 parameter, to be used as filter in dataset

    - by adolf garlic - SAVE BBC6MUSIC
    What I want to do: The report parameter that the user can select from is a subset of the data available in a dimension. I have a query to retrieve this subset and it works fine. I am then trying to use the selection as a filter in a dataset. In the dataset properties, you can add a parameter based on the ones you have specified on the report, but when you look in the filters pane, there seems to be no way to select this. If I try and select something in there, it creates a whole new parameter and dataset. How can I get a dataset to consume and filter by a parameter that the user has selected?

    Read the article

  • Adding Class instance as a new Row in DataGridView (c#)

    - by Amit Shah
    Hi All, I have a class, say:

        [Serializable]
        public class Answer
        {
            [DisplayName("ID")]
            public string ID { get; set; }

            [DisplayName("Value")]
            public string Value { get; set; }
        }

    and I have a DataGridView with columns bound to the above class. Instances of this Answer class are created dynamically, as and when required. How do I update the DataGridView as each instance of the class is created? Is it possible to do something of this sort?

        dataGridView.Rows.Add(classInstance);

    Thanks in advance, Amit
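
    Rows.Add expects individual cell values rather than a whole object, so a common pattern, sketched here on the assumption that the grid's columns have DataPropertyName set to ID and Value, is to bind the grid to a BindingList<Answer> and add new instances to that list; the grid picks up each addition automatically:

        using System.ComponentModel;
        using System.Windows.Forms;

        public partial class AnswersForm : Form
        {
            private readonly BindingList<Answer> answers = new BindingList<Answer>();

            public AnswersForm()
            {
                InitializeComponent();
                dataGridView.AutoGenerateColumns = false;  // columns are already defined and bound
                dataGridView.DataSource = answers;
            }

            // Call this wherever an Answer is created dynamically.
            public void AddAnswer(Answer answer)
            {
                answers.Add(answer);  // BindingList raises ListChanged, so a new row appears
            }
        }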

    Read the article

  • T-SQL out of order insert

    - by tearman
    Basically I have a User-Defined Table Type (for use as a table-valued variable) which I'm referencing in a stored procedure that effectively just calls two other stored procedures, then inserts those values into the table type. Id est:

        INSERT INTO @tableValuedVariable (var1, var2, var3, var4, var5)
        EXEC [dbo].StoredProcedure1;

        INSERT INTO @tableValuedVariable (var1, var2, var5)
        EXEC [dbo].StoredProcedure2;

    You can probably already tell what I'm going to ask. Basically StoredProcedure2 only returns a few of the values the table is set to hold, and I'd like the other columns to just be null (as their defaults define). Only SQL is complaining that I'm not specifying all the columns available to that table. The return datasets can be quite sizable, so I'd like to avoid loops and such for obvious reasons. Thanks for any help.

    Read the article

  • Sql Server SMO connection timeout not working

    - by Uros Calakovic
    I have the following PowerShell code:

        function Get-SmoConnection
        {
            param ([string] $serverName = "", [int] $connectionTimeout = 0)

            if($serverName.Length -eq 0)
            {
                $serverConnection = New-Object `
                    Microsoft.SqlServer.Management.Common.ServerConnection
            }
            else
            {
                $serverConnection = New-Object `
                    Microsoft.SqlServer.Management.Common.ServerConnection($serverName)
            }

            if($connectionTimeout -ne 0)
            {
                $serverConnection.ConnectTimeout = $connectionTimeout
            }

            try
            {
                $serverConnection.Connect()
                $serverConnection
            }
            catch [System.Management.Automation.MethodInvocationException]
            {
                $null
            }
        }

        $connection = Get-SmoConnection "ServerName" 2

        if($connection -ne $null)
        {
            Write-Host $connection.ServerInstance
            Write-Host $connection.ConnectTimeout
        }
        else
        {
            Write-Host "Connection could not be established"
        }

    It seems to work, except for the part that attempts to set the SMO connection timeout. If the connection is successful, I can verify that ServerConnection.ConnectTimeout is set to 2 (seconds), but when I supply a bogus name for the SQL Server instance, it still attempts to connect to it for ~15 seconds (which I believe is the default timeout value). Does anyone have experience with setting the SMO connection timeout? Thank you in advance.

    Read the article

  • Execution Plan Optimization when where clause is removed then added back

    - by nmushov
    I have a stored procedure that uses a table-valued function which executes in 9 seconds. If I alter the table-valued function and remove the where clause, the stored procedure executes in 3 seconds. If I then add the where clause back, the query still executes in 3 seconds. I took a look at the execution plans, and it appears that after I remove the where clause the execution plan includes parallelism, and the scan count for 2 of my tables drops from 50000 and 65000 down to 5 and 3. After I add the where clause back, the optimized execution plan is still used unless I run DBCC FREEPROCCACHE. Questions: 1. Why would SQL Server start using the optimized execution plan for both queries only when I first remove the where clause? 2. Is there a way to force SQL Server to use this execution plan? Also, this is a parameterized all-in-one query that uses the (@Parameter IS NULL OR column LIKE @Parameter) pattern in the where clause, which I believe is bad for performance.

        RETURNS TABLE
        AS
        RETURN
        (
            SELECT TOP (@PageNumber * @PageSize)
                CASE
                    WHEN @SortOrder = 'Expensive'   THEN ROW_NUMBER() OVER (ORDER BY SellingPrice DESC)
                    WHEN @SortOrder = 'Inexpensive' THEN ROW_NUMBER() OVER (ORDER BY SellingPrice ASC)
                    WHEN @SortOrder = 'LowMiles'    THEN ROW_NUMBER() OVER (ORDER BY Mileage ASC)
                    WHEN @SortOrder = 'HighMiles'   THEN ROW_NUMBER() OVER (ORDER BY Mileage DESC)
                    WHEN @SortOrder = 'Closest'     THEN ROW_NUMBER() OVER (ORDER BY P1.Distance ASC)
                    WHEN @SortOrder = 'Newest'      THEN ROW_NUMBER() OVER (ORDER BY [Year] DESC)
                    WHEN @SortOrder = 'Oldest'      THEN ROW_NUMBER() OVER (ORDER BY [Year] ASC)
                    ELSE ROW_NUMBER() OVER (ORDER BY InventoryID ASC)
                END as rn,
                P1.InventoryID,
                P1.SellingPrice,
                P1.Distance,
                P1.Mileage,
                Count(*) OVER () RESULT_COUNT,
                dimCarStatus.[year]
            FROM (SELECT InventoryID, SellingPrice, Zip.Distance, Mileage, ColorKey, CarStatusKey, CarKey
                  FROM facInventory
                  JOIN @ZipCodes Zip ON Zip.DealerKey = facInventory.DealerKey) as P1
            JOIN dimColor     ON dimColor.ColorKey         = P1.ColorKey
            JOIN dimCarStatus ON dimCarStatus.CarStatusKey = P1.CarStatusKey
            JOIN dimCar       ON dimCar.CarKey             = P1.CarKey
            WHERE (@ExteriorColor   is NULL OR dimColor.ExteriorColor like @ExteriorColor)
              AND (@InteriorColor   is NULL OR dimColor.InteriorColor like @InteriorColor)
              AND (@Condition       is NULL OR dimCarStatus.Condition like @Condition)
              AND (@Year            is NULL OR dimCarStatus.[Year] like @Year)
              AND (@Certified       is NULL OR dimCarStatus.Certified like @Certified)
              AND (@Make            is NULL OR dimCar.Make like @Make)
              AND (@ModelCategory   is NULL OR dimCar.ModelCategory like @ModelCategory)
              AND (@Model           is NULL OR dimCar.Model like @Model)
              AND (@Trim            is NULL OR dimCar.Trim like @Trim)
              AND (@BodyType        is NULL OR dimCar.BodyType like @BodyType)
              AND (@VehicleTypeCode is NULL OR dimCar.VehicleTypeCode like @VehicleTypeCode)
              AND (@MinPrice        is NULL OR P1.SellingPrice >= @MinPrice)
              AND (@MaxPrice        is NULL OR P1.SellingPrice < @MaxPrice)
              AND (@Mileage         is NULL OR P1.Mileage < @Mileage)
            ORDER BY
                CASE
                    WHEN @SortOrder = 'Expensive'   THEN -SellingPrice
                    WHEN @SortOrder = 'Inexpensive' THEN SellingPrice
                    WHEN @SortOrder = 'LowMiles'    THEN Mileage
                    WHEN @SortOrder = 'HighMiles'   THEN -Mileage
                    WHEN @SortOrder = 'Closest'     THEN P1.Distance
                    WHEN @SortOrder = 'Newest'      THEN -[YEAR]
                    WHEN @SortOrder = 'Oldest'      THEN [YEAR]
                    ELSE InventoryID
                END
        )

    Read the article

  • Render an SSRS report with a Map as an image map without actually having a ReportViewer on the page

    - by Erica Merchant
    I have a report that has a Map with spatial data. Clicking an object on that map sends you to other pages on the site. I have tried a few different ways of displaying the report:

    1. If I put a ReportViewer on the actual page, the page sometimes takes 10+ seconds to load, but the ReportViewer creates a fully operable image map.
    2. If I create a ReportViewer in the code-behind, I can use the Render method to format the report as HTML4.0 and get the stream IDs, from which I can (painfully) extract the image. This is pretty fast (1-2 seconds), but only gives me an image, no image map.
    3. I can get very similar functionality to the above by using rendering extensions on the report URL to create an image and then setting an image's source to this URL. This is the fastest method, but still does not create an image map.

    So is there a way to create an image map from the report without having to use the ReportViewer? Or a way to substantially speed up the ReportViewer?

    Read the article

  • Can you add a folder structure to IIS7?

    - by tigermain
    I am in the process of setting up a new server which I share with 2 colleagues. Is it possible to get a folder structure into IIS7 at all (in the MMC) so we can keep our sites separate? In the IIS7 management console I would like a set of folders, one for each of my colleagues, so that each of our websites sits within its own sub-folder.

    Read the article

  • .NET AxWebBrowser can't open page "Navigation to the webpage was canceled" error

    - by Erdnod
    Oh, I am stuck again on this quite annoying problem. I am trying to do a little automated browsing using the axWebBrowser component, and it worked before. Now, all of a sudden, I get the following message when I try to browse to a page: Navigation to the webpage was canceled. What you can try: Refresh the page. I totally disabled my firewall, ran VS2008 in Administrator mode, and even installed IE8, all to no avail. By the way, IE8 doesn't work either. I don't know if the problem is related, but all I get when I put a URL into IE8 is this message: Internet Explorer cannot display the webpage. I don't so much mind that; I just really need to get my Visual Studio axWebBrowser working.

    Read the article
