Search Results

Search found 43986 results on 1760 pages for 'sql session state'.


  • Service Broker message processing order

    - by Blootac
    Everywhere I read says that messages handled by Service Broker are processed in the order they arrive. Yet if I create a table, message type, contract, and service, give the queue an activation stored procedure that waits 2 seconds and inserts the message into a table, set the max queue readers to 5 or 10, and send 20-odd messages, I can see in the table that they were inserted out of order, even though the messages all appear in the correct order when I look at the contents of the queue after sending. Is it because WAITFOR DELAY waits to the nearest second, so each thread wakes with a different sub-second time and they then fight for a lock, or something like that? The delay is there to simulate the cost of joins and other work. Thanks. Demo code:

        -- create the table and the Service Broker objects
        CREATE TABLE test
        (
            id int IDENTITY(1,1),
            contents varchar(100)
        )

        CREATE MESSAGE TYPE test
        CREATE CONTRACT mycontract ( test SENT BY INITIATOR )
        GO

        CREATE PROCEDURE dostuff
        AS
        BEGIN
            DECLARE @msg varchar(100);
            RECEIVE TOP (1) @msg = message_body FROM myQueue
            IF @msg IS NOT NULL
            BEGIN
                WAITFOR DELAY '00:00:02'
                INSERT INTO test (contents) VALUES (@msg)
            END
        END
        GO

        ALTER QUEUE myQueue WITH STATUS = ON,
            ACTIVATION (
                STATUS = ON,
                PROCEDURE_NAME = dostuff,
                MAX_QUEUE_READERS = 10,
                EXECUTE AS SELF
            )

        CREATE SERVICE senderService ON QUEUE myQueue ( mycontract )
        CREATE SERVICE receiverService ON QUEUE myQueue ( mycontract )
        GO

        -- now insert lots of messages into the queue
        DECLARE @dialog_handle uniqueidentifier

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>1</test>');

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>2</test>')

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>3</test>')

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>4</test>')

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>5</test>')

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>6</test>')

        BEGIN DIALOG @dialog_handle
            FROM SERVICE senderService TO SERVICE 'receiverService' ON CONTRACT mycontract;
        SEND ON CONVERSATION @dialog_handle MESSAGE TYPE test ('<test>7</test>')
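
    Service Broker only guarantees ordering within a single conversation; with a new dialog per SEND and up to 10 activation readers, interleaving across conversations is expected. Below is a minimal sketch of an activation procedure that at least keeps the RECEIVE and the INSERT in one transaction, so a message is never removed from the queue before its result is committed. This is a sketch only, reusing the objects from the demo code; strict global ordering would additionally require a single conversation and MAX_QUEUE_READERS = 1.

        ALTER PROCEDURE dostuff
        AS
        BEGIN
            DECLARE @msg varchar(100);

            WHILE (1 = 1)
            BEGIN
                BEGIN TRAN;

                -- take the next message; the RECEIVE and the INSERT share one transaction
                WAITFOR (
                    RECEIVE TOP (1) @msg = CAST(message_body AS varchar(100))
                    FROM myQueue
                ), TIMEOUT 1000;

                IF @@ROWCOUNT = 0
                BEGIN
                    ROLLBACK TRAN;
                    BREAK;
                END

                WAITFOR DELAY '00:00:02';          -- simulated work from the question
                INSERT INTO test (contents) VALUES (@msg);

                COMMIT;
            END
        END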

    Read the article

  • WebClient Lost my Session

    - by kamiar3001
    Hi folks, I have a problem. First of all, look at my web service method:

        [WebMethod(), ScriptMethod(ResponseFormat = ResponseFormat.Json)]
        public string GetPageContent(string VirtualPath)
        {
            WebClient client = new WebClient();
            string content = string.Empty;
            client.Encoding = System.Text.Encoding.UTF8;
            try
            {
                if (VirtualPath.IndexOf("______") > 0)
                    content = client.DownloadString(HttpContext.Current.Request.UrlReferrer.AbsoluteUri.Replace("Main.aspx", VirtualPath.Replace("__", ".")));
                else
                    content = client.DownloadString(HttpContext.Current.Request.UrlReferrer.AbsoluteUri.Replace("Main.aspx", VirtualPath));
            }
            catch
            {
                content = "Not Found";
            }
            return content;
        }

    As you can see, the web service method reads and buffers a page from its own host. It works, and I use it to add some Ajax functionality to my web site. The problem is that client.DownloadString(...) loses the session: all the session values related to the requested page are null. In more detail: at page load of the page I load through the web service, I set a session value:

        HttpContext.Current.Session[E_ShopData.Constants.SessionKey_ItemList] = result;

    But when I click a button in that page, this session value is null; the session does not carry over. How can I solve this problem? The web service is called by some jQuery code like the following:

        $.ajax({
            type: "Post",
            url: "Services/NewE_ShopServices.asmx" + "/" + "GetPageContent",
            data: "{" + "VirtualPath" + ":" + mp + "}",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            complete: hideBlocker,
            success: LoadAjaxSucceeded,
            async: true,
            cache: false,
            error: AjaxFailed
        });

    Read the article

  • MS SQL: Primary file group is full

    - by aximili
    I have a very large table in my database and I am starting to get this error:

        Could not allocate a new page for database 'mydatabase' because of insufficient disk space
        in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup,
        adding additional files to the filegroup, or setting autogrowth on for existing files in
        the filegroup.

    How do you fix this error? I don't understand the suggestions there.
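
    As a rough illustration of the second and third suggestions in that message, one could add a file to the PRIMARY filegroup or enable autogrowth on an existing file. A sketch only; the logical file names, path, and sizes are placeholders (check sys.database_files for the real names):

        -- add another data file to the PRIMARY filegroup
        ALTER DATABASE mydatabase
        ADD FILE
        (
            NAME = mydatabase_data2,                     -- hypothetical logical name
            FILENAME = 'D:\Data\mydatabase_data2.ndf',   -- hypothetical path
            SIZE = 1GB,
            FILEGROWTH = 256MB
        )
        TO FILEGROUP [PRIMARY];

        -- or let an existing file grow automatically
        ALTER DATABASE mydatabase
        MODIFY FILE (NAME = mydatabase_data, FILEGROWTH = 256MB);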

    Read the article

  • Multilingual best practices on SQL Server, EF and MVC combinations

    - by dengereli
    In ASP.NET MVC, resource files look like enough for an application's multilingual, multi-culture support of UI text. But I am wondering about best practices for the data itself. User stories:

    - A user sets the culture to en-US and sees all product items in English.
    - A user sets the culture to fr-FR and sees all product items in French.
    - A user sets the culture to ru-RU and sees all product items in Russian.
    - A user who has no right to change the culture settings never reaches the multilingual resources; the application uses the default language and culture.
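
    On the data side, one common pattern is a per-culture translation table with a fallback to the default culture. A minimal T-SQL sketch; all table and column names here are illustrative, not from the question:

        CREATE TABLE dbo.Product
        (
            ProductId int IDENTITY(1,1) PRIMARY KEY,
            Price     decimal(10,2) NOT NULL
        );

        CREATE TABLE dbo.ProductTranslation
        (
            ProductId   int           NOT NULL REFERENCES dbo.Product(ProductId),
            CultureCode nvarchar(10)  NOT NULL,    -- 'en-US', 'fr-FR', 'ru-RU', ...
            Name        nvarchar(200) NOT NULL,
            Description nvarchar(max) NULL,
            PRIMARY KEY (ProductId, CultureCode)
        );

        -- read in the requested culture, falling back to the default culture
        DECLARE @culture nvarchar(10), @defaultCulture nvarchar(10);
        SET @culture = 'fr-FR';
        SET @defaultCulture = 'en-US';

        SELECT  p.ProductId,
                p.Price,
                COALESCE(t.Name, d.Name) AS Name
        FROM    dbo.Product p
        LEFT JOIN dbo.ProductTranslation t
               ON t.ProductId = p.ProductId AND t.CultureCode = @culture
        LEFT JOIN dbo.ProductTranslation d
               ON d.ProductId = p.ProductId AND d.CultureCode = @defaultCulture;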

    Read the article

  • VB working with SQL DB - end of row count, keeps looping

    - by Tramd
    I'm adding an ID and a name that I'm pulling from a database to a combo box. My problem is that for some reason my loop doesn't end once it reaches the end of the records in the database table. Here's my code:

        For intcount = 0 To dtOrders.Rows.Count - 1
            cmbSearch.Items.Add(dtOrders.Rows(intcount)("EmployeeID").ToString & " " & _
                dtOrders.Rows(intcount)("EmployeeLastName").ToString & ", " & _
                dtOrders.Rows(intcount)("EmployeeFirstName").ToString)
        Next

    Shouldn't the .Rows.Count - 1 stop it once it reaches the last record? It loops through 4 times.

    Read the article

  • JSTL Session Lookup - Key Has Periods

    - by John W.
    Hello all, I am working with some legacy code, and at some point a key is put in the session like this:

        session.setAttribute("com.org.something.Object", someObject);

    Trying to access this in a JSP using JSTL is a bit difficult, because if I tried it the way I normally would:

        ${sessionScope.com.org.something.Object.someFieldGetter}

    it fails, as most of us can imagine, because there is no "com" object in session scope. I also tried:

        ${sessionScope.'com.org.something.Object'.someFieldGetter}

    and a parsing error was thrown. Does anyone know how to resolve this so that I can correctly get the object, similar to session.getAttribute("com.org.something.Object"), but through JSTL? Thank you.

    Read the article

  • Handling concurrency in SQL Server 2005

    - by sameer
    Hi, I have one question for you; if you can answer it and point me to a resource, it will be a great help. I have a scenario where I need to create an appointment slot and a serial number for each slot, per member. Example:

        Member Id | App Slot #
        1         | 1
        1         | 2
        2         | 1
        2         | 2
        1         | 3

    What I'm doing is taking the MAX slot number for the member, incrementing it, and inserting it. The problem is that concurrent users can create slots: after I read the max slot, another user may insert a slot, and the value I'm working with is no longer valid. How do I overcome this problem? Thanks & Regards, Sameer
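
    One way to serialize the read-increment-insert per member is to take the MAX inside a transaction with UPDLOCK and HOLDLOCK hints, so concurrent callers for the same member block until the first insert commits. A minimal sketch; the table and column names are made up for illustration:

        BEGIN TRAN;

        DECLARE @next int;

        -- UPDLOCK + HOLDLOCK keeps the range locked until COMMIT,
        -- so two sessions cannot read the same MAX for one member
        SELECT @next = ISNULL(MAX(AppSlotNo), 0) + 1
        FROM   dbo.AppointmentSlot WITH (UPDLOCK, HOLDLOCK)
        WHERE  MemberId = @MemberId;

        INSERT INTO dbo.AppointmentSlot (MemberId, AppSlotNo)
        VALUES (@MemberId, @next);

        COMMIT;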

    Read the article

  • generating sequence number

    - by stackoverflowuser
    Hi, based on the following table TableA:

        Data
        --------
        Dummy1
        Dummy2
        Dummy3
        .
        .
        DummyN

    is there a way to generate a sequence number while selecting rows from the table? Something like:

        select sequence() as ID, * from Data

    that would give:

        ID   Data
        ---------
        1    Dummy1
        2    Dummy2
        3    Dummy3
        ...
        N    DummyN

    Thanks.
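
    On SQL Server 2005 and later, ROW_NUMBER() can play the role of the hypothetical sequence() above. A sketch against the table as described; ordering by the Data column is an assumption, since the question gives no other key:

        SELECT ROW_NUMBER() OVER (ORDER BY Data) AS ID,
               Data
        FROM   TableA;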

    Read the article

  • Clustered index - multi-part vs single-part index and effects of inserts/deletes

    - by Anssssss
    This question is about what happens with the reorganizing of data in a clustered index when an insert is done. I assume inserts should be more expensive on a table that has a clustered index than on one that does not, because reorganizing the data in a clustered index involves changing the physical layout of the data on disk. I'm not sure how to phrase my question except through an example I came across at work.

    Assume there is a table (Junk) and there are two queries that are run against it: the first searches by Name and the second searches by Name and Something. As I was working on the database I discovered that the table had been created with two indexes, one to support each query, like so:

        --drop table Junk1
        CREATE TABLE Junk1
        (
            Name char(5),
            Something char(5),
            WhoCares int
        )

        CREATE CLUSTERED INDEX IX_Name
        ON Junk1 ( Name )

        CREATE NONCLUSTERED INDEX IX_Name_Something
        ON Junk1 ( Name, Something )

    Now when I looked at the two indexes, it seemed that IX_Name is redundant, since IX_Name_Something can be used by any query that wants to search by Name. So I would eliminate IX_Name and make IX_Name_Something the clustered index instead:

        --drop table Junk2
        CREATE TABLE Junk2
        (
            Name char(5),
            Something char(5),
            WhoCares int
        )

        CREATE CLUSTERED INDEX IX_Name_Something
        ON Junk2 ( Name, Something )

    Someone suggested that the first indexing scheme should be kept because it would result in more efficient inserts/deletes (assume there is no need to worry about updates to Name and Something). Would that make sense? I think the second indexing method would be better since it means one less index needs to be maintained. I would appreciate any insight into this specific example, or pointers to more information on the maintenance of clustered indexes.
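
    One way to put numbers on the insert/delete cost of the two schemes is to load the same test data into Junk1 and Junk2 and compare page counts and fragmentation afterwards. A sketch using sys.dm_db_index_physical_stats (SQL Server 2005+):

        SELECT  OBJECT_NAME(ips.object_id)        AS table_name,
                i.name                            AS index_name,
                ips.index_type_desc,
                ips.page_count,
                ips.avg_fragmentation_in_percent
        FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
        JOIN    sys.indexes i
          ON    i.object_id = ips.object_id AND i.index_id = ips.index_id
        WHERE   OBJECT_NAME(ips.object_id) IN ('Junk1', 'Junk2');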

    Read the article

  • Which can handle a huge surge of queries: SQL Server 2008 Fulltext or Lucene

    - by Luke101
    I am creating a widget that will be installed on several websites and blogs. The widget will analyse the remote web page's title and content, then return relevant articles/links from my website. The amount of traffic we expect is very large: roughly 500K queries a day, and growing from there. I need the queries to be answered very quickly, so I need the candidate to be high performance, similar to Google AdSense. The remote title can be from 5 to 50 words, and the description we use will be no more than 3000 words. Which of these two do you think can handle the load?

    Read the article

  • SQL Access INSERT INTO Autonumber Field

    - by KrazyKash
    I'm trying to make a Visual Basic application that is connected to a Microsoft Access database using OLEDB. Inside my database I have a Users table with the following layout:

        ID       - AutoNumber
        Username - Text
        Password - Text
        Email    - Text

    To insert data into the table I use the following query:

        INSERT INTO Users (Username, Password, Email) VALUES ('004606', 'Password', '[email protected]')

    However, I get an error with this statement; according to VB it's a syntax error. Then I tried the following query:

        INSERT INTO Users (Username) VALUES ('004606')

    This query worked absolutely fine... So the problem is that I can insert into just one field but not all 3 (excluding the ID field, because it's an AutoNumber). Any help would be appreciated. Thanks
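
    For what it's worth, Password is a reserved word in Access SQL, which commonly produces exactly this kind of syntax error from OleDb; bracketing the column names is one usual fix. A sketch only, with placeholder values:

        INSERT INTO Users ([Username], [Password], [Email])
        VALUES ('004606', 'Password', 'someone@example.com')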

    Read the article

  • LINQ to SQL: On load processing of lazy loaded associations

    - by Matt Holmes
    If I have an object that lazy loads an association with very large objects, is there a way I can do processing at the time the lazy load occurs? I thought I could use AssociateWith or LoadWith from DataLoadOptions, but there are very, very specific restrictions on what you can do in those. Basically I need to be notified when an EntitySet<T> decides it's time to load the associated object, so I can catch that event and do some processing on the loaded object. I don't want to simply walk through the EntitySet when I load the parent object, because that will force all the lazy loaded items to load (defeating the purpose of lazy loading entirely).

    Read the article

  • Checking for Magento login on external page

    - by LinuxGnut
    I'm hitting a wall here while trying to access items from Magento on an external page (same server, same domain, etc.). I want to see if the user is logged into Magento before showing them certain parts of the site. Keep in mind that this code exists outside of Magento.

        Mage::app("default");
        Mage::getSingleton("core/session", array("name" => "frontend"));
        if (empty($session)) {
            $session = Mage::getSingleton("customer/session");
        }
        if ($session->isLoggedIn())
            echo "hi";
        $cart = Mage::helper('checkout/cart')->getCart()->getItemsCount();
        echo $cart;

    $cart returns 0, where I definitely have products in my cart. isLoggedIn() also returns false. What am I doing wrong here? Is there an option in Magento that I need to turn on or off to be able to access this information outside of Magento?

    Read the article

  • Do not re-create session id with .NET cookieless sessions

    - by nLL
    Due to my target audience I am using .NET cookieless sessions in auto-detect mode, and from time to time I get visitors redirected to a cookieless session URL like domain.com/(S(jdhdghdghd))/default.aspx. The problem is that if this URL is requested after the session has expired, .NET will re-create the session with that same id. What I want to find is a way to force .NET to create another session id instead of reusing the one that came with the URL. Is that possible?

    Read the article

  • Count problem in SQL when I want results from different tables

    - by Nicklas
    Here is my stored procedure:

        ALTER PROCEDURE GetProducts
            @CategoryID INT
        AS
        SELECT  COUNT(tblReview.GroupID) AS ReviewCount,
                COUNT(tblComment.GroupID) AS CommentCount,
                Product.GroupID,
                MAX(Product.ProductID) AS ProductID,
                AVG(Product.Price) AS Price,
                MAX(Product.Year) AS Year,
                MAX(Product.Name) AS Name,
                AVG(tblReview.Grade) AS Grade
        FROM    tblReview, tblComment, Product
        WHERE   (Product.CategoryID = @CategoryID)
        GROUP BY Product.GroupID
        HAVING  COUNT(DISTINCT Product.GroupID) = 1

    This is what the tables look like:

        Product   | tblReview   | tblComment
        ----------|-------------|-----------
        ProductID | ReviewID    | CommentID
        Name      | Description | Description
        Year      | GroupID     | GroupID
        Price     | Grade       |
        GroupID   |             |

    GroupID is name_year of a product, e.g. Nike_2010. One product can have different sizes, for example:

        ProductID | Name   | Year | Price | Size | GroupID
        1         | Nike   | 2010 | 50    | 8    | Nike_2010
        2         | Nike   | 2010 | 50    | 9    | Nike_2010
        3         | Nike   | 2010 | 50    | 10   | Nike_2010
        4         | Adidas | 2009 | 45    | 8    | Adidas_2009
        5         | Adidas | 2009 | 45    | 9    | Adidas_2009
        6         | Adidas | 2009 | 45    | 10   | Adidas_2009

    I don't get the right counts from tblReview and tblComment. If I add a review to Nike size 8 and one review to Nike size 10, I want a count of 2 when I list the products by distinct GroupID. Right now I get the same count for reviews and comments, and both are wrong. I use a DataList to show all the products with a distinct/unique GroupID, and I want it to look like this:

         ______________
        |              |
        | Name: Nike   |
        | Year: 2010   |
        | (All Sizes)  |
        | x Reviews    |
        | x Comments   |
        | x AVG Grade  |
        |______________|

    That is, all review counts, comment counts and the average grade of all products with the same GroupID. The average works great.
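
    The counts double up because tblReview and tblComment are cross-joined through the FROM list, so each review row is repeated once per comment row and vice versa. One sketch that keeps the counts independent uses correlated subqueries per GroupID, assuming the schema exactly as described above:

        ALTER PROCEDURE GetProducts
            @CategoryID INT
        AS
        SELECT  p.GroupID,
                MAX(p.ProductID) AS ProductID,
                MAX(p.Name)      AS Name,
                MAX(p.Year)      AS Year,
                AVG(p.Price)     AS Price,
                (SELECT COUNT(*)     FROM tblReview  r WHERE r.GroupID = p.GroupID) AS ReviewCount,
                (SELECT COUNT(*)     FROM tblComment c WHERE c.GroupID = p.GroupID) AS CommentCount,
                (SELECT AVG(r.Grade) FROM tblReview  r WHERE r.GroupID = p.GroupID) AS Grade
        FROM    Product p
        WHERE   p.CategoryID = @CategoryID
        GROUP BY p.GroupID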

    Read the article

  • Query with many CASE statements - optimization

    - by Nemanja Vujacic
    Hi guys, I have one very dirty query that surely can be optimized, because there are so many CASE expressions in it!

        SELECT (CASE pa.KplusTable_Id
                    WHEN 1 THEN sp.sp_id
                    WHEN 2 THEN fw.fw_id
                    WHEN 3 THEN s.sw_Id
                    WHEN 4 THEN id.ia_id
                END) AS Deal_Id,
               MAX(CASE pa.KplusTable_Id
                    WHEN 1 THEN sp.Trans_Id
                    WHEN 2 THEN fw.Trans_Id
                    WHEN 3 THEN s.Trans_Id
                    WHEN 4 THEN id.Trans_Id
                END) AS TransId_CurrentMax
        INTO #MaxRazlicitOdNull
        FROM #PotencijalniAktuelni pa
        LEFT JOIN kplus_sp sp (nolock) ON sp.sp_id = pa.Deal_Id AND pa.KplusTable_Id = 1
        LEFT JOIN kplus_fw fw (nolock) ON fw.fw_id = pa.Deal_Id AND pa.KplusTable_Id = 2
        LEFT JOIN dev_sw s (nolock)    ON s.sw_Id  = pa.Deal_Id AND pa.KplusTable_Id = 3
        LEFT JOIN kplus_ia id (nolock) ON id.ia_id = pa.Deal_Id AND pa.KplusTable_Id = 4
        WHERE ISNULL(CASE pa.KplusTable_Id
                        WHEN 1 THEN sp.BROJ_TIKETA
                        WHEN 2 THEN fw.BROJ_TIKETA
                        WHEN 3 THEN s.tiket
                        WHEN 4 THEN id.BROJ_TIKETA
                     END, '') <> ''
        GROUP BY CASE pa.KplusTable_Id
                    WHEN 1 THEN sp.sp_id
                    WHEN 2 THEN fw.fw_id
                    WHEN 3 THEN s.sw_Id
                    WHEN 4 THEN id.ia_id
                 END

    Because I have the same conditions several times, do you have an idea how to optimize the query and make it simpler and better? All suggestions are welcome! Thanks in advance! Nemanja
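
    One way to write each CASE only once is to compute them in a CROSS APPLY and then refer to the derived columns everywhere else. A sketch of that shape (SQL Server 2005+, same tables and aliases as above):

        SELECT  x.Deal_Id,
                MAX(x.Trans_Id) AS TransId_CurrentMax
        INTO    #MaxRazlicitOdNull
        FROM    #PotencijalniAktuelni pa
        LEFT JOIN kplus_sp sp ON sp.sp_id = pa.Deal_Id AND pa.KplusTable_Id = 1
        LEFT JOIN kplus_fw fw ON fw.fw_id = pa.Deal_Id AND pa.KplusTable_Id = 2
        LEFT JOIN dev_sw   s  ON s.sw_Id  = pa.Deal_Id AND pa.KplusTable_Id = 3
        LEFT JOIN kplus_ia id ON id.ia_id = pa.Deal_Id AND pa.KplusTable_Id = 4
        CROSS APPLY
        (
            -- each CASE is written exactly once here
            SELECT CASE pa.KplusTable_Id WHEN 1 THEN sp.sp_id       WHEN 2 THEN fw.fw_id
                                         WHEN 3 THEN s.sw_Id        WHEN 4 THEN id.ia_id       END,
                   CASE pa.KplusTable_Id WHEN 1 THEN sp.Trans_Id    WHEN 2 THEN fw.Trans_Id
                                         WHEN 3 THEN s.Trans_Id     WHEN 4 THEN id.Trans_Id    END,
                   CASE pa.KplusTable_Id WHEN 1 THEN sp.BROJ_TIKETA WHEN 2 THEN fw.BROJ_TIKETA
                                         WHEN 3 THEN s.tiket        WHEN 4 THEN id.BROJ_TIKETA END
        ) AS x (Deal_Id, Trans_Id, Ticket)
        WHERE   ISNULL(x.Ticket, '') <> ''
        GROUP BY x.Deal_Id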

    Read the article

  • Good way to format decimal in SQL Server

    - by Brad
    We store a decimal(9,8) in our database. It can have any number of places after the decimal point (well, no more than 8). I am frustrated because I want to display it as human-readable text as part of a larger string created on the server, with as many decimals to the right of the decimal point as are non-zero. For example, these are all good:

        0.05
        0.12345
        3.14159265

    If I do CAST(d AS varchar(50)) I get formatting like:

        0.05000000
        0.12345000
        3.14159265

    I get similar output if I cast/convert to a float or another type before casting to a varchar. I know how to get a fixed number of decimal places, such as:

        0.050
        0.123
        3.142

    But that is not what I want. Yes, I know I can do this through complicated string manipulation (REPLACE, etc.), but there should be a better way to do it.
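
    In the absence of a built-in trim, one fairly contained string approach is to cast once and then strip trailing zeros (and a leftover trailing point). A sketch, assuming SQL Server 2005+:

        DECLARE @d decimal(9,8);
        SET @d = 3.14000000;

        DECLARE @s varchar(50), @t varchar(50);
        SET @s = CAST(@d AS varchar(50));                     -- '3.14000000'

        -- drop trailing zeros after the decimal point
        SET @t = CASE WHEN @s LIKE '%.%'
                      THEN LEFT(@s, LEN(@s) - PATINDEX('%[^0]%', REVERSE(@s)) + 1)
                      ELSE @s
                 END;

        -- drop a trailing decimal point left over by whole numbers (e.g. '3.')
        SELECT CASE WHEN RIGHT(@t, 1) = '.' THEN LEFT(@t, LEN(@t) - 1) ELSE @t END AS pretty;
        -- pretty = '3.14'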

    Read the article

  • Need help with SQL Query

    - by StackOverflowNewbie
    Say I have 2 tables:

        Person
        - Id
        - Name

        PersonAttribute
        - Id
        - PersonId
        - Name
        - Value

    Further, let's say that each person has 2 attributes (say, gender and age). A sample record would look like this:

        Person->Id = 1
        Person->Name = 'John Doe'

        PersonAttribute->Id = 1
        PersonAttribute->PersonId = 1
        PersonAttribute->Name = 'Gender'
        PersonAttribute->Value = 'Male'

        PersonAttribute->Id = 2
        PersonAttribute->PersonId = 1
        PersonAttribute->Name = 'Age'
        PersonAttribute->Value = '30'

    Question: how do I query this such that I get a result like this: 'John Doe', 'Male', '30'?
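
    One common way to flatten name/value attributes into columns is a MAX(CASE ...) pivot per attribute name. A sketch against the tables exactly as described:

        SELECT  p.Name,
                MAX(CASE WHEN a.Name = 'Gender' THEN a.Value END) AS Gender,
                MAX(CASE WHEN a.Name = 'Age'    THEN a.Value END) AS Age
        FROM    Person p
        LEFT JOIN PersonAttribute a ON a.PersonId = p.Id
        GROUP BY p.Id, p.Name;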

    Read the article

  • Return if remote stored procedure fails

    - by njk
    I am in the process of creating a stored procedure. This stored procedure runs local as well as external stored procedures. For simplicity, I'll call the local server [LOCAL] and the remote server [REMOTE].

        USE [LOCAL]
        GO
        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        ALTER PROCEDURE [dbo].[monthlyRollUp]
        AS
        SET NOCOUNT, XACT_ABORT ON

        BEGIN TRY
            EXEC [REMOTE].[DB].[table].[sp]

            -- This transaction should only begin if the remote procedure does not fail
            BEGIN TRAN
                EXEC [LOCAL].[DB].[table].[sp1]
            COMMIT

            BEGIN TRAN
                EXEC [LOCAL].[DB].[table].[sp2]
            COMMIT

            BEGIN TRAN
                EXEC [LOCAL].[DB].[table].[sp3]
            COMMIT

            BEGIN TRAN
                EXEC [LOCAL].[DB].[table].[sp4]
            COMMIT
        END TRY
        BEGIN CATCH
            -- Insert error into log table
            INSERT INTO [dbo].[log_table]
                (stamp, errorNumber, errorSeverity, errorState, errorProcedure, errorLine, errorMessage)
            SELECT GETDATE(), ERROR_NUMBER(), ERROR_SEVERITY(), ERROR_STATE(),
                   ERROR_PROCEDURE(), ERROR_LINE(), ERROR_MESSAGE()
        END CATCH
        GO

    When using a transaction around the remote procedure, it throws this error:

        OLE DB provider ... returned message "The partner transaction manager has disabled its
        support for remote/network transactions."

    I get that I'm unable to start a local transaction around a remote procedure call. How can I ensure that this procedure will exit and roll back if any part of it fails?
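
    Since a local BEGIN TRAN cannot span the linked-server call without distributed transactions, one shape that is sometimes used is: leave the remote EXEC outside any transaction, wrap all the local work in a single transaction, and roll back from the CATCH block via XACT_STATE(). A sketch only; it keeps the question's four-part names and assumes XACT_ABORT ON as above:

        BEGIN TRY
            -- remote call first, outside any local transaction
            EXEC [REMOTE].[DB].[table].[sp]

            -- one transaction for all the local work, so it commits or rolls back as a unit
            BEGIN TRAN
                EXEC [LOCAL].[DB].[table].[sp1]
                EXEC [LOCAL].[DB].[table].[sp2]
                EXEC [LOCAL].[DB].[table].[sp3]
                EXEC [LOCAL].[DB].[table].[sp4]
            COMMIT
        END TRY
        BEGIN CATCH
            IF XACT_STATE() <> 0
                ROLLBACK TRAN;          -- undo any local work

            -- log the error here (as in the original CATCH block), then leave
            RETURN;
        END CATCH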

    Read the article

  • Replication - User defined table type not propagating to subscriber

    - by Aamod Thakur
    I created a user-defined table type named tvp_Shipment with two columns (id and name), generated a snapshot, and the user-defined table type was properly propagated to all the subscribers. I was using this TVP in a stored procedure and everything worked fine. Then I wanted to add one more column, created_date, to this table-valued parameter. I dropped the stored procedure (from replication too), dropped and recreated the user-defined table type with 3 columns, then recreated the stored procedure and enabled it for publication. When I generate a new snapshot, the changes to the user-defined table type are not propagated to the subscriber: the newly added column was not added to the subscription. The error messages:

        The schema script 'usp_InsertAirSa95c0e23_218.sch' could not be propagated to the subscriber.
        (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147201001)
        Get help: http://help/MSSQL_REPL-2147201001

        Invalid column name 'created_date'.
        (Source: MSSQLServer, Error number: 207)
        Get help: http://help/207

    Read the article

  • SQL Server 2008 - Difference between time(0)

    - by lugeno
    I have a table with working_hours time(0) and lunch_hours time(0). What I have to do is the following: if lunch_hours is greater than one hour, I have to calculate the offset.

        Example: lunch_hours = 01:30:00  =>  offset = 00:30:00

    Once that is done, I have to subtract the offset from the working_hours value.

        Example: offset = 00:30:00, working_hours = 07:30:00  =>  working_hours = 07:00:00

    The result must be in time(0) format (hh:mm:ss). I've tried several solutions but nothing works yet; I used DATEDIFF, but probably not in the correct way. Thanks for any help. Bye!
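
    One way to express that with DATEDIFF and DATEADD directly on the time(0) values; a sketch, assuming SQL Server 2008 since time(0) is used:

        DECLARE @working_hours time(0), @lunch_hours time(0);
        SET @working_hours = '07:30:00';
        SET @lunch_hours   = '01:30:00';

        -- offset = the part of lunch beyond one hour, in seconds
        DECLARE @offset_seconds int;
        SET @offset_seconds = CASE WHEN @lunch_hours > CAST('01:00:00' AS time(0))
                                   THEN DATEDIFF(SECOND, CAST('01:00:00' AS time(0)), @lunch_hours)
                                   ELSE 0
                              END;

        -- subtract the offset from working_hours, keeping the time(0) type
        SELECT CAST(DATEADD(SECOND, -@offset_seconds, @working_hours) AS time(0)) AS adjusted_working_hours;
        -- adjusted_working_hours = 07:00:00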

    Read the article
