Search Results

Search found 79588 results on 3184 pages for 'sql data storage'.

  • SQL VIEW Basics

    SQL Views are essential for the database developer. However, it is common to see them misused, or neglected. Joe Celko tackles an introduction to the subject, but there is something about the topic that makes it likely that even the experienced developer will find out something new from reading it.
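
    For readers new to the topic, the baseline the article builds on is just a named SELECT; a minimal sketch (table and column names are invented for illustration):

        -- A simple view exposing only the columns an application needs
        CREATE VIEW dbo.ActiveCustomers
        AS
        SELECT CustomerID, CustomerName, City
        FROM dbo.Customers
        WHERE IsActive = 1;

    Callers then query the view exactly as they would a table, and the underlying SELECT is expanded at execution time.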

    Read the article

  • Avoiding Parameter Sniffing in SQL Server

    Parameter sniffing is when SQL Server compiles a stored procedure’s execution plan using the parameter values supplied on its first execution, and then reuses that plan for subsequent executions regardless of the parameter values.
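
    One common way to sidestep a badly sniffed plan is to ask for a statement-level recompile, so the plan is built for the current parameter values; this is a generic sketch with invented object names, not code from the article:

        CREATE PROCEDURE dbo.GetOrdersByCustomer
            @CustomerID int
        AS
        BEGIN
            SELECT OrderID, OrderDate, TotalDue
            FROM dbo.Orders
            WHERE CustomerID = @CustomerID
            OPTION (RECOMPILE);  -- plan reflects this execution's parameter value
        END;

    The trade-off is extra compilation cost on every call, so it only suits statements whose optimal plan genuinely varies with the parameters.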

    Read the article

  • Free eBook: SQL Server Backup and Restore

    You can download a free eBook from SQLServerCentral and Red Gate software on the most important task a SQL Server DBA or developer needs to understand.
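
    At its simplest, the task the book covers comes down to two commands; a minimal sketch with placeholder names and paths:

        -- Full backup with page checksums
        BACKUP DATABASE SalesDB
        TO DISK = N'D:\Backups\SalesDB_Full.bak'
        WITH CHECKSUM, INIT;

        -- Confirm the backup is readable without actually restoring it
        RESTORE VERIFYONLY
        FROM DISK = N'D:\Backups\SalesDB_Full.bak'
        WITH CHECKSUM;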

    Read the article

  • SQL View: Beyond the Basics

    Joe Celko delves into the main uses of views, explains how the WITH CHECK OPTION works, and demonstrates how the INSTEAD OF trigger can be used in those cases where views are not updatable.
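
    For context, WITH CHECK OPTION rejects any INSERT or UPDATE through a view that would produce rows the view itself cannot see; a small illustrative sketch (object names are invented):

        CREATE VIEW dbo.OpenOrders
        AS
        SELECT OrderID, CustomerID, Status
        FROM dbo.Orders
        WHERE Status = 'Open'
        WITH CHECK OPTION;

        -- Fails: the modified row would fall outside the view's WHERE clause
        UPDATE dbo.OpenOrders SET Status = 'Closed' WHERE OrderID = 42;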

    Read the article

  • Handling SQL Server Errors

    This article covers the basics of TRY CATCH error handling in T-SQL, introduced in SQL Server 2005. It covers the common functions that return information about an error, and the use of TRY CATCH blocks in stored procedures and transactions.
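
    The pattern the article describes takes roughly this shape (a sketch with placeholder object names):

        BEGIN TRY
            BEGIN TRANSACTION;
            UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE AccountID = 1;
            UPDATE dbo.Accounts SET Balance = Balance + 100 WHERE AccountID = 2;
            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0
                ROLLBACK TRANSACTION;
            -- The ERROR_* functions only return values inside the CATCH block
            SELECT ERROR_NUMBER()   AS ErrorNumber,
                   ERROR_SEVERITY() AS ErrorSeverity,
                   ERROR_MESSAGE()  AS ErrorMessage;
        END CATCH;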

    Read the article

  • Practically Cloudy: SQL Server Disaster Recovery to Microsoft Azure - Backups

    In the first in a series on the practicalities of using the Microsoft Azure Platform for the SQL Server professional, Buck Woody shows that, whatever your version of SQL Server, there is a way of storing offsite backups in the cloud.
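
    The exact mechanism depends on the SQL Server version; on SQL Server 2012 SP1 CU2 and later, backup directly to blob storage is built in, and a rough sketch (storage account, container, and key are placeholders, and this is not necessarily the approach the article takes) looks like this:

        -- Credential holding the storage account name and access key
        CREATE CREDENTIAL AzureBackupCredential
        WITH IDENTITY = 'mystorageaccount',
             SECRET = '<storage access key>';

        BACKUP DATABASE SalesDB
        TO URL = N'https://mystorageaccount.blob.core.windows.net/backups/SalesDB_Full.bak'
        WITH CREDENTIAL = 'AzureBackupCredential', CHECKSUM;

    Older versions can still get the same benefit by backing up locally and copying the .bak files to blob storage with a separate tool.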

    Read the article

  • Free eBook: Understanding SQL Server Concurrency

    When you can’t get to your data because another application has it locked, a thorough knowledge of SQL Server concurrency will give you the confidence to decide what to do.
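
    When a session appears stuck, a quick first look is to ask the engine who is blocked and by whom; a minimal sketch using the dynamic management views:

        -- Requests that are currently waiting on another session
        SELECT r.session_id,
               r.blocking_session_id,
               r.wait_type,
               r.wait_time,
               t.text AS running_sql
        FROM sys.dm_exec_requests AS r
        CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
        WHERE r.blocking_session_id <> 0;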

    Read the article

  • Complex SQL query help on aggregating values for nested subquery

    - by François Beausoleil
    Hi! I have people, companies, employees, events and event kinds. I'm making a report/followup sheet where people, companies and employees are the rows, and the columns are event kinds. Event kinds are simple values describing: "Promised Donation", "Received Donation", "Phoned", "Followed up" and such. Event kinds are ordered:

        CREATE TABLE event_kinds (
            id,
            name,
            position);

    Events hold the actual reference to the event:

        CREATE TABLE events (
            id,
            person_id,
            company_id,
            referrer_id,
            event_kind_id,
            created_at);

    referrer_id is another reference to people. It is the person which sent the information/tip along, and is an optional field, although I sometimes want to filter on an event_kind that has a specific referrer, while I don't for other event kinds. Notice I don't have an employee ID reference. The reference exists, but is implied. I have application code to validate that person_id and company_id really reference an employee record. The other tables are pretty basic:

        CREATE TABLE people (
            id,
            name);

        CREATE TABLE companies (
            id,
            name);

        CREATE TABLE employees (
            id,
            person_id,
            company_id);

    I'm trying to achieve the following report:

        Referrer            Phoned      Promised    Donated
        Francois            Feb 16th    Feb 20th    Mar 1st
        Apple (Steve Jobs)  Steve Ballmer   Mar 3rd
        IBM                 Bill Gates      Mar 7th

    The first row is a people record, the 2nd is an employee, and the 3rd is a company. If I asked for referrer Bill Gates for Phoned event kinds, I'd only see the 3rd row, while asking for Steve and Phoned would return no rows. Right now, I do 3 queries, one for companies, one for people and a last one for employees. I want the event kind columns to be ordered, but I do that in application code and show it properly there. Here's where I'm at so far:

        SELECT companies.id, companies.name,
               (SELECT events.id FROM events
                 WHERE events.referrer_id = 1470 AND events.company_id = companies.id
                   AND events.person_id IS NULL AND events.event_kind_id = 9
                 ORDER BY created_at DESC LIMIT 1) event_kind_9,
               (SELECT events.id FROM events
                 WHERE events.company_id = companies.id AND events.person_id IS NULL
                   AND events.event_kind_id = 10
                 ORDER BY created_at DESC LIMIT 1) event_kind_10,
               (SELECT events.created_at FROM events
                 WHERE events.referrer_id = 1470 AND events.company_id = companies.id
                   AND events.person_id IS NULL AND events.event_kind_id = 9
                 ORDER BY created_at DESC LIMIT 1) event_kind_9_order
        FROM "companies"

        SELECT people.id, people.name,
               (SELECT events.id FROM events
                 WHERE events.referrer_id = 1470 AND events.company_id IS NULL
                   AND events.person_id = people.id AND events.event_kind_id = 9
                 ORDER BY created_at DESC LIMIT 1) event_kind_9,
               (SELECT events.id FROM events
                 WHERE events.company_id IS NULL AND events.person_id = people.id
                   AND events.event_kind_id = 10
                 ORDER BY created_at DESC LIMIT 1) event_kind_10,
               (SELECT events.created_at FROM events
                 WHERE events.referrer_id = 1470 AND events.company_id IS NULL
                   AND events.person_id = people.id AND events.event_kind_id = 9
                 ORDER BY created_at DESC LIMIT 1) event_kind_9_order
        FROM "people"

        SELECT employees.id, employees.company_id, employees.person_id,
               (SELECT events.id FROM events
                 WHERE events.referrer_id = 1470 AND events.company_id = employees.company_id
                   AND events.person_id = employees.person_id AND events.event_kind_id = 9
                 ORDER BY created_at DESC LIMIT 1) event_kind_9,
               (SELECT events.id FROM events
                 WHERE events.company_id = employees.company_id AND events.person_id = employees.person_id
                   AND events.event_kind_id = 10
                 ORDER BY created_at DESC LIMIT 1) event_kind_10,
               (SELECT events.created_at FROM events
                 WHERE events.referrer_id = 1470 AND events.company_id = employees.company_id
                   AND events.person_id = employees.person_id AND events.event_kind_id = 9
                 ORDER BY created_at DESC LIMIT 1) event_kind_9_order
        FROM "employees"

    I rather suspect I'm doing this wrong. There should be an "easier" way to do it. One other filter criteria would be to filter on people/company names: WHERE LOWER(companies.name) LIKE '%apple%'. Note that I'm ordering by the dates of event_kind_9 here, and a secondary sort is by person/company name. To summarize: I want to paginate the result set, find the latest event for each cell, order the result set by the date of the latest event, and by company/person name, filter by referrer in some event kinds, but not others. For reference, I'm using PostgreSQL, from Ruby, ActiveRecord/Rails. The solution is pure SQL though.
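
    One way to avoid repeating the same correlated subquery per column is conditional aggregation over a single pass of events; the sketch below (untested against the poster's schema, and returning the latest date per kind rather than the event id) shows the idea for the company rows:

        SELECT c.id,
               c.name,
               MAX(CASE WHEN e.event_kind_id = 9
                         AND e.referrer_id = 1470 THEN e.created_at END) AS phoned_at,
               MAX(CASE WHEN e.event_kind_id = 10 THEN e.created_at END) AS promised_at
        FROM companies c
        LEFT JOIN events e
               ON e.company_id = c.id
              AND e.person_id IS NULL
        GROUP BY c.id, c.name
        ORDER BY phoned_at DESC NULLS LAST, c.name;

    The same shape works for the people and employee queries, and the three results can then be combined with UNION ALL before paginating.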

    Read the article

  • Access 2007 VBA & SQL - Update a Subform pointed at a dynamically created query

    - by Lucretius
    Abstract: I'm using VB to recreate a query each time a user selects one of 3 options from a drop down menu, which appends the WHERE clause If they've selected anything from the combo boxes. I then am attempting to get the information displayed on the form to refresh thereby filtering what is displayed in the table based on user input. 1) Dynamically created query using VB. Private Sub BuildQuery() ' This sub routine will redefine the subQryAllJobsQuery based on input from ' the user on the Management tab. Dim strQryName As String Dim strSql As String ' Main SQL SELECT statement Dim strWhere As String ' Optional WHERE clause Dim qryDef As DAO.QueryDef Dim dbs As DAO.Database strQryName = "qryAllOpenJobs" strSql = "SELECT * FROM tblOpenJobs" Set dbs = CurrentDb ' In case the query already exists we should deleted it ' so that we can rebuild it. The ObjectExists() function ' calls a public function in GlobalVariables module. If ObjectExists("Query", strQryName) Then DoCmd.DeleteObject acQuery, strQryName End If ' Check to see if anything was selected from the Shift ' Drop down menu. If so, begin the where clause. If Not IsNull(Me.cboShift.Value) Then strWhere = "WHERE tblOpenJobs.[Shift] = '" & Me.cboShift.Value & "'" End If ' Check to see if anything was selected from the Department ' drop down menu. If so, append or begin the where clause. If Not IsNull(Me.cboDepartment.Value) Then If IsNull(strWhere) Then strWhere = strWhere & " AND tblOpenJobs.[Department] = '" & Me.cboDepartment.Value & "'" Else strWhere = "WHERE tblOpenJobs.[Department] = '" & Me.cboDepartment.Value & "'" End If End If ' Check to see if anything was selected from the Date ' field. If so, append or begin the Where clause. If Not IsNull(Me.txtDate.Value) Then If Not IsNull(strWhere) Then strWhere = strWhere & " AND tblOpenJobs.[Date] = '" & Me.txtDate.Value & "'" Else strWhere = "WHERE tblOpenJobs.[Date] = '" & Me.txtDate.Value & "'" End If End If ' Concatenate the Select and the Where clause together ' unless all three parameters are null, in which case return ' just the plain select statement. If IsNull(Me.cboShift.Value) And IsNull(Me.cboDepartment.Value) And IsNull(Me.txtDate.Value) Then Set qryDef = dbs.CreateQueryDef(strQryName, strSql) Else strSql = strSql & " " & strWhere Set qryDef = dbs.CreateQueryDef(strQryName, strSql) End If End Sub 2) Main Form where the user selects items from combo boxes. picture of the main form and sub form http://i48.tinypic.com/25pjw2a.png 3) Subform pointed at the query created in step 1. Chain of events: 1) User selects item from drop down list on the main form. 2) Old query is deleted, new query is generated (same name). 3) Subform pointed at query does not update, but if you open the query by itself the correct results are displayed. Name of the Query: qryAllOpenJobs name of the subform: subQryAllOpenJobs Also, the Row Source of subQryAllOpenJobs = qryAllOpenJobs Name of the main form: frmManagement

    Read the article

  • loading xml into SQL Server 2008 using sqlbulkload component

    - by mohamed
    "Error: Schema: relationship expected on 'headerRecord'." I get the above error while load xml file to SQL Server 2008 using SQLXMLBulkLoad4 Component , the xml file contains Call Detail records, I have generated schema file from xml file using both , Dataset and XSD.exe tool, but the error remains same., if there is another way to imports xml file with multiple tables that have relationship in each file into SQL Server 2008? . Here the xml file: <CallEventDataFile> <headerRecord> <productionDateTime>0912021247482B0300</productionDateTime> <recordingEntity>00</recordingEntity> <extensions/> </headerRecord> <callEventRecords> <mtSMSRecord> <recordType>7</recordType> <serviceCentre>91521230</serviceCentre> <servedIMSI>36570000031728F2</servedIMSI> <servedIMEI>53886000707896F0</servedIMEI> <servedMSISDN>915212454503F2</servedMSISDN> <msClassmark>3319A1</msClassmark> <recordingEntity>915212110100</recordingEntity> <location> <locationAreaCode>0006</locationAreaCode> <cellIdentifier>0C6E</cellIdentifier> </location> <deliveryTime>0912021535412B0300</deliveryTime> <systemType> <gERAN/> </systemType> <basicService> <teleservice>21</teleservice> </basicService> <additionalChgInfo> <chargeIndicator>2</chargeIndicator> </additionalChgInfo> <chargedParty> <calledParty/> </chargedParty> <orgRNCorBSCId>8E1A</orgRNCorBSCId> <orgMSCId>921A</orgMSCId> <globalAreaID>36F70500060C6E</globalAreaID> <subscriberCategory>0A</subscriberCategory> <firstmccmnc>36F705</firstmccmnc> <smsUserDataType>FF</smsUserDataType> <origination>8191F2</origination> <callReference>1605EB2FE1</callReference> </mtSMSRecord> <moSMSRecord> <recordType>6</recordType> <servedIMSI>36570000238707F9</servedIMSI> <servedIMEI>53928320195925F0</servedIMEI> <servedMSISDN>915212159430F2</servedMSISDN> <msClassmark>3319A2</msClassmark> <serviceCentre>91521230</serviceCentre> <recordingEntity>915212110100</recordingEntity> <location> <locationAreaCode>001B</locationAreaCode> <cellIdentifier>6983</cellIdentifier> </location> <messageReference>01</messageReference> <originationTime>0912021535412B0300</originationTime> <destinationNumber>8111F1</destinationNumber> <systemType> <gERAN/> </systemType> <basicService> <teleservice>22</teleservice> </basicService> <additionalChgInfo> <chargeIndicator>2</chargeIndicator> </additionalChgInfo> <chargedParty> <callingParty/> </chargedParty> <orgRNCorBSCId>8F1A</orgRNCorBSCId> <orgMSCId>921A</orgMSCId> <globalAreaID>36F705001B6983</globalAreaID> <subscriberCategory>0A</subscriberCategory> <firstmccmnc>36F705</firstmccmnc> <smsUserDataType>FF</smsUserDataType> <callReference>1701BED4FF</callReference> </moSMSRecord> <ssActionRecord> <recordType>10</recordType> <servedIMSI>36570000636448F8</servedIMSI> <servedIMEI>53246030714961F0</servedIMEI> <servedMSISDN>915212056928F8</servedMSISDN> <msClassmark>3018A1</msClassmark> <recordingEntity>915212110100</recordingEntity> <location> <locationAreaCode>000C</locationAreaCode> <cellIdentifier>05A5</cellIdentifier> </location> <supplService>FF</supplService> <ssAction> <ussdInvocation/> </ssAction> <ssActionTime>0912021535412B0300</ssActionTime> <ssParameters> <unstructuredData>AA5C2E3702</unstructuredData> </ssParameters> <callReference>1701BED500</callReference> <systemType> <gERAN/> </systemType> <ussdCodingScheme>0F</ussdCodingScheme> <ussdString> <UssdString>AA5C2E3702</UssdString> </ussdString> <ussdRequestCounter>1</ussdRequestCounter> <additionalChgInfo> <chargeIndicator>1</chargeIndicator> </additionalChgInfo> <orgRNCorBSCId>8E1A</orgRNCorBSCId> 
<orgMSCId>921A</orgMSCId> <globalAreaID>36F705000C05A5</globalAreaID> <subscriberCategory>0A</subscriberCategory> <firstmccmnc>36F705</firstmccmnc> </ssActionRecord> <moCallRecord> <recordType>0</recordType> <servedIMSI>36570000807501F5</servedIMSI> <servedIMEI>53246030713955F0</servedIMEI> <servedMSISDN>915212157901F0</servedMSISDN> <callingNumber>A151911700</callingNumber> <calledNumber>8151677589</calledNumber> <roamingNumber>A111113850</roamingNumber> <recordingEntity>915212110100</recordingEntity> <mscIncomingROUTE> <rOUTEName>HWBSC2</rOUTEName> </mscIncomingROUTE> <mscOutgoingROUTE> <rOUTEName>HWBSC2</rOUTEName> </mscOutgoingROUTE> <location> <locationAreaCode>0006</locationAreaCode> <cellIdentifier>0C2F</cellIdentifier> </location> <basicService> <teleservice>11</teleservice> </basicService> <msClassmark>3319A1</msClassmark> <answerTime>0912021535382B0300</answerTime> <releaseTime>0912021535422B0300</releaseTime> <callDuration>4</callDuration> <radioChanRequested> <dualFullRatePreferred/> </radioChanRequested> <radioChanUsed> <halfRate/> </radioChanUsed> <causeForTerm>0</causeForTerm> <diagnostics> <gsm0408Cause>144</gsm0408Cause> </diagnostics> <callReference>1701BED501</callReference> <additionalChgInfo> <chargeIndicator>2</chargeIndicator> </additionalChgInfo> <gsm-SCFAddress>915212110130</gsm-SCFAddress> <serviceKey>1</serviceKey> <networkCallReference>171D555132</networkCallReference> <mSCAddress>915212110100</mSCAddress> <speechVersionSupported>25</speechVersionSupported> <speechVersionUsed>21</speechVersionUsed> <numberOfDPEncountered>3</numberOfDPEncountered> <levelOfCAMELService>01</levelOfCAMELService> <freeFormatData>800130</freeFormatData> <systemType> <gERAN/> </systemType> <classmark3>C000</classmark3> <chargedParty> <callingParty/> </chargedParty> <mscOutgoingCircuit>1051</mscOutgoingCircuit> <orgRNCorBSCId>8E1A</orgRNCorBSCId> <orgMSCId>921A</orgMSCId> <calledIMSI>36570000635618F8</calledIMSI> <globalAreaID>36F70500060C2F</globalAreaID> <subscriberCategory>0A</subscriberCategory> <firstmccmnc>36F705</firstmccmnc> <lastmccmnc>36F705</lastmccmnc> </moCallRecord> <mtCallRecord> <recordType>1</recordType> <servedIMSI>36570000635618F8</servedIMSI> <servedIMEI>53464010474309F0</servedIMEI> <servedMSISDN>915212755697F8</servedMSISDN> <callingNumber>A151911700</callingNumber> <recordingEntity>915212110100</recordingEntity> <mscIncomingROUTE> <rOUTEName>HWBSC2</rOUTEName> </mscIncomingROUTE> <mscOutgoingROUTE> <rOUTEName>HWBSC2</rOUTEName> </mscOutgoingROUTE> <location> <locationAreaCode>0006</locationAreaCode> <cellIdentifier>0C2D</cellIdentifier> </location> <basicService> <teleservice>11</teleservice> </basicService> <supplServicesUsed> <SuppServiceUsedid> <ssCode>11</ssCode> <ssTime>0912021535382B0300</ssTime> </SuppServiceUsedid> </supplServicesUsed> <msClassmark>331981</msClassmark> <answerTime>0912021535382B0300</answerTime> <releaseTime>0912021535422B0300</releaseTime> <callDuration>4</callDuration> <radioChanRequested> <dualFullRatePreferred/> </radioChanRequested> <radioChanUsed> <halfRate/> </radioChanUsed> <causeForTerm>0</causeForTerm> <diagnostics> <gsm0408Cause>144</gsm0408Cause> </diagnostics> <callReference>1701BED502</callReference> <additionalChgInfo> <chargeIndicator>2</chargeIndicator> </additionalChgInfo> <networkCallReference>171D555132</networkCallReference> <mSCAddress>915212110100</mSCAddress> <speechVersionSupported>25</speechVersionSupported> <speechVersionUsed>21</speechVersionUsed> <systemType> <gERAN/> </systemType> <classmark3>C000</classmark3> 
<chargedParty> <calledParty/> </chargedParty> <roamingNumber>A111113850</roamingNumber> <mscIncomingCircuit>9119</mscIncomingCircuit> <orgRNCorBSCId>8E1A</orgRNCorBSCId> <orgMSCId>921A</orgMSCId> <globalAreaID>36F70500060C2D</globalAreaID> <subscriberCategory>0A</subscriberCategory> <firstmccmnc>36F705</firstmccmnc> <lastmccmnc>36F705</lastmccmnc> </mtCallRecord> <incGatewayRecord> <recordType>3</recordType> <callingNumber>A17005991565</callingNumber> <calledNumber>A1853643F7</calledNumber> <recordingEntity>915212110100</recordingEntity> <mscIncomingROUTE> <rOUTEName>ZPSTN</rOUTEName> </mscIncomingROUTE> <mscOutgoingROUTE> <rOUTEName>ZTEBSC3</rOUTEName> </mscOutgoingROUTE> <answerTime>0912021535302B0300</answerTime> <releaseTime>0912021535422B0300</releaseTime> <callDuration>12</callDuration> <causeForTerm>0</causeForTerm> <diagnostics> <gsm0408Cause>144</gsm0408Cause> </diagnostics> <callReference>2203AFBF84</callReference> <basicService> <teleservice>11</teleservice> </basicService> <additionalChgInfo> <chargeIndicator>2</chargeIndicator> </additionalChgInfo> <roamingNumber>A111111980</roamingNumber> <mscIncomingCircuit>934</mscIncomingCircuit> <orgMSCId>921A</orgMSCId> <mscIncomingRouteAttribute> <isup/> </mscIncomingRouteAttribute> <networkCallReference>22432B5132</networkCallReference> </incGatewayRecord> <outGatewayRecord> <recordType>4</recordType> <callingNumber>A151012431</callingNumber> <calledNumber>817026936873</calledNumber> <recordingEntity>915212110100</recordingEntity> <mscIncomingROUTE> <rOUTEName>HWBSC</rOUTEName> </mscIncomingROUTE> <mscOutgoingROUTE> <rOUTEName>ZPSTN</rOUTEName> </mscOutgoingROUTE> <answerTime>0912021535192B0300</answerTime> <releaseTime>0912021535432B0300</releaseTime> <callDuration>24</callDuration> <causeForTerm>0</causeForTerm> <diagnostics> <gsm0408Cause>144</gsm0408Cause> </diagnostics> <callReference>2303B19880</callReference> <basicService> <teleservice>11</teleservice> </basicService> <additionalChgInfo> <chargeIndicator>2</chargeIndicator> </additionalChgInfo> <mscOutgoingCircuit>398</mscOutgoingCircuit> <orgMSCId>921A</orgMSCId> <mscOutgoingRouteAttribute> <isup/> </mscOutgoingRouteAttribute> <networkCallReference>238BE55132</networkCallReference> </outGatewayRecord> </callEventRecords> <trailerRecord> <productionDateTime>0912021247512B0300</productionDateTime> <recordingEntity>00</recordingEntity> <firstCallDateTime>000000000000000000</firstCallDateTime> <lastCallDateTime>000000000000000000</lastCallDateTime> <noOfRecords>521</noOfRecords> <extensions/> </trailerRecord> <extensions/> </CallEventDataFile> Schema File generated by Dataset: <?xml version="1.0" standalone="yes"?> <xs:schema id="NewDataSet" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata"> <xs:element name="location"> <xs:complexType> <xs:sequence> <xs:element name="locationAreaCode" type="xs:string" minOccurs="0" /> <xs:element name="cellIdentifier" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="systemType"> <xs:complexType> <xs:sequence> <xs:element name="gERAN" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="basicService"> <xs:complexType> <xs:sequence> <xs:element name="teleservice" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="additionalChgInfo"> <xs:complexType> <xs:sequence> <xs:element name="chargeIndicator" type="xs:string" minOccurs="0" /> </xs:sequence> 
</xs:complexType> </xs:element> <xs:element name="chargedParty"> <xs:complexType> <xs:sequence> <xs:element name="calledParty" type="xs:string" minOccurs="0" /> <xs:element name="callingParty" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="mscIncomingROUTE"> <xs:complexType> <xs:sequence> <xs:element name="rOUTEName" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="mscOutgoingROUTE"> <xs:complexType> <xs:sequence> <xs:element name="rOUTEName" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="radioChanRequested"> <xs:complexType> <xs:sequence> <xs:element name="dualFullRatePreferred" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="radioChanUsed"> <xs:complexType> <xs:sequence> <xs:element name="halfRate" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="diagnostics"> <xs:complexType> <xs:sequence> <xs:element name="gsm0408Cause" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="CallEventDataFile"> <xs:complexType> <xs:sequence> <xs:element name="extensions" type="xs:string" minOccurs="0" /> <xs:element name="headerRecord" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="productionDateTime" type="xs:string" minOccurs="0" /> <xs:element name="recordingEntity" type="xs:string" minOccurs="0" /> <xs:element name="extensions" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="callEventRecords" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="mtSMSRecord" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="recordType" type="xs:string" minOccurs="0" /> <xs:element name="serviceCentre" type="xs:string" minOccurs="0" /> <xs:element name="servedIMSI" type="xs:string" minOccurs="0" /> <xs:element name="servedIMEI" type="xs:string" minOccurs="0" /> <xs:element name="servedMSISDN" type="xs:string" minOccurs="0" /> <xs:element name="msClassmark" type="xs:string" minOccurs="0" /> <xs:element name="recordingEntity" type="xs:string" minOccurs="0" /> <xs:element name="deliveryTime" type="xs:string" minOccurs="0" /> <xs:element name="orgRNCorBSCId" type="xs:string" minOccurs="0" /> <xs:element name="orgMSCId" type="xs:string" minOccurs="0" /> <xs:element name="globalAreaID" type="xs:string" minOccurs="0" /> <xs:element name="subscriberCategory" type="xs:string" minOccurs="0" /> <xs:element name="firstmccmnc" type="xs:string" minOccurs="0" /> <xs:element name="smsUserDataType" type="xs:string" minOccurs="0" /> <xs:element name="origination" type="xs:string" minOccurs="0" /> <xs:element name="callReference" type="xs:string" minOccurs="0" /> <xs:element ref="location" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="systemType" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="basicService" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="additionalChgInfo" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="chargedParty" minOccurs="0" maxOccurs="unbounded" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="moSMSRecord" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="recordType" type="xs:string" minOccurs="0" /> <xs:element name="servedIMSI" type="xs:string" minOccurs="0" /> <xs:element 
name="servedIMEI" type="xs:string" minOccurs="0" /> <xs:element name="servedMSISDN" type="xs:string" minOccurs="0" /> <xs:element name="msClassmark" type="xs:string" minOccurs="0" /> <xs:element name="serviceCentre" type="xs:string" minOccurs="0" /> <xs:element name="recordingEntity" type="xs:string" minOccurs="0" /> <xs:element name="messageReference" type="xs:string" minOccurs="0" /> <xs:element name="originationTime" type="xs:string" minOccurs="0" /> <xs:element name="destinationNumber" type="xs:string" minOccurs="0" /> <xs:element name="orgRNCorBSCId" type="xs:string" minOccurs="0" /> <xs:element name="orgMSCId" type="xs:string" minOccurs="0" /> <xs:element name="globalAreaID" type="xs:string" minOccurs="0" /> <xs:element name="subscriberCategory" type="xs:string" minOccurs="0" /> <xs:element name="firstmccmnc" type="xs:string" minOccurs="0" /> <xs:element name="smsUserDataType" type="xs:string" minOccurs="0" /> <xs:element name="callReference" type="xs:string" minOccurs="0" /> <xs:element ref="location" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="systemType" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="basicService" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="additionalChgInfo" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="chargedParty" minOccurs="0" maxOccurs="unbounded" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="ssActionRecord" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="recordType" type="xs:string" minOccurs="0" /> <xs:element name="servedIMSI" type="xs:string" minOccurs="0" /> <xs:element name="servedIMEI" type="xs:string" minOccurs="0" /> <xs:element name="servedMSISDN" type="xs:string" minOccurs="0" /> <xs:element name="msClassmark" type="xs:string" minOccurs="0" /> <xs:element name="recordingEntity" type="xs:string" minOccurs="0" /> <xs:element name="supplService" type="xs:string" minOccurs="0" /> <xs:element name="ssActionTime" type="xs:string" minOccurs="0" /> <xs:element name="callReference" type="xs:string" minOccurs="0" /> <xs:element name="ussdCodingScheme" type="xs:string" minOccurs="0" /> <xs:element name="ussdRequestCounter" type="xs:string" minOccurs="0" /> <xs:element name="orgRNCorBSCId" type="xs:string" minOccurs="0" /> <xs:element name="orgMSCId" type="xs:string" minOccurs="0" /> <xs:element name="globalAreaID" type="xs:string" minOccurs="0" /> <xs:element name="subscriberCategory" type="xs:string" minOccurs="0" /> <xs:element name="firstmccmnc" type="xs:string" minOccurs="0" /> <xs:element ref="location" minOccurs="0" maxOccurs="unbounded" /> <xs:element name="ssAction" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="ussdInvocation" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="ssParameters" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="unstructuredData" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element ref="systemType" minOccurs="0" maxOccurs="unbounded" /> <xs:element name="ussdString" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="UssdString" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element ref="additionalChgInfo" minOccurs="0" maxOccurs="unbounded" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="moCallRecord" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> 
<xs:element name="recordType" type="xs:string" minOccurs="0" /> <xs:element name="servedIMSI" type="xs:string" minOccurs="0" /> <xs:element name="servedIMEI" type="xs:string" minOccurs="0" /> <xs:element name="servedMSISDN" type="xs:string" minOccurs="0" /> <xs:element name="callingNumber" type="xs:string" minOccurs="0" /> <xs:element name="calledNumber" type="xs:string" minOccurs="0" /> <xs:element name="roamingNumber" type="xs:string" minOccurs="0" /> <xs:element name="recordingEntity" type="xs:string" minOccurs="0" /> <xs:element name="msClassmark" type="xs:string" minOccurs="0" /> <xs:element name="answerTime" type="xs:string" minOccurs="0" /> <xs:element name="releaseTime" type="xs:string" minOccurs="0" /> <xs:element name="callDuration" type="xs:string" minOccurs="0" /> <xs:element name="causeForTerm" type="xs:string" minOccurs="0" /> <xs:element name="callReference" type="xs:string" minOccurs="0" /> <xs:element name="gsm-SCFAddress" type="xs:string" minOccurs="0" /> <xs:element name="serviceKey" type="xs:string" minOccurs="0" /> <xs:element name="networkCallReference" type="xs:string" minOccurs="0" /> <xs:element name="mSCAddress" type="xs:string" minOccurs="0" /> <xs:element name="speechVersionSupported" type="xs:string" minOccurs="0" /> <xs:element name="speechVersionUsed" type="xs:string" minOccurs="0" /> <xs:element name="numberOfDPEncountered" type="xs:string" minOccurs="0" /> <xs:element name="levelOfCAMELService" type="xs:string" minOccurs="0" /> <xs:element name="freeFormatData" type="xs:string" minOccurs="0" /> <xs:element name="classmark3" type="xs:string" minOccurs="0" /> <xs:element name="mscOutgoingCircuit" type="xs:string" minOccurs="0" /> <xs:element name="orgRNCorBSCId" type="xs:string" minOccurs="0" /> <xs:element name="orgMSCId" type="xs:string" minOccurs="0" /> <xs:element name="calledIMSI" type="xs:string" minOccurs="0" /> <xs:element name="globalAreaID" type="xs:string" minOccurs="0" /> <xs:element name="subscriberCategory" type="xs:string" minOccurs="0" /> <xs:element name="firstmccmnc" type="xs:string" minOccurs="0" /> <xs:element name="lastmccmnc" type="xs:string" minOccurs="0" /> <xs:element ref="mscIncomingROUTE" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="mscOutgoingROUTE" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="location" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="basicService" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="radioChanRequested" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="radioChanUsed" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="diagnostics" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="additionalChgInfo" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="systemType" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="chargedParty" minOccurs="0" maxOccurs="unbounded" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="mtCallRecord" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="recordType" type="xs:string" minOccurs="0" /> <xs:element name="servedIMSI" type="xs:string" minOccurs="0" /> <xs:element name="servedIMEI" type="xs:string" minOccurs="0" /> <xs:element name="servedMSISDN" type="xs:string" minOccurs="0" /> <xs:element name="callingNumber" type="xs:string" minOccurs="0" /> <xs:element name="recordingEntity" type="xs:string" minOccurs="0" /> <xs:element name="msClassmark" type="xs:string" minOccurs="0" /> <xs:element name="answerTime" type="xs:string" minOccurs="0" /> <xs:element 
name="releaseTime" type="xs:string" minOccurs="0" /> <xs:element name="callDuration" type="xs:string" minOccurs="0" /> <xs:element name="causeForTerm" type="xs:string" minOccurs="0" /> <xs:element name="callReference" type="xs:string" minOccurs="0" /> <xs:element name="networkCallReference" type="xs:string" minOccurs="0" /> <xs:element name="mSCAddress" type="xs:string" minOccurs="0" /> <xs:element name="speechVersionSupported" type="xs:string" minOccurs="0" /> <xs:element name="speechVersionUsed" type="xs:string" minOccurs="0" /> <xs:element name="classmark3" type="xs:string" minOccurs="0" /> <xs:element name="roamingNumber" type="xs:string" minOccurs="0" /> <xs:element name="mscIncomingCircuit" type="xs:string" minOccurs="0" /> <xs:element name="orgRNCorBSCId" type="xs:string" minOccurs="0" /> <xs:element name="orgMSCId" type="xs:string" minOccurs="0" /> <xs:element name="globalAreaID" type="xs:string" minOccurs="0" /> <xs:element name="subscriberCategory" type="xs:string" minOccurs="0" /> <xs:element name="firstmccmnc" type="xs:string" minOccurs="0" /> <xs:element name="lastmccmnc" type="xs:string" minOccurs="0" /> <xs:element ref="mscIncomingROUTE" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="mscOutgoingROUTE" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="location" minOccurs="0" maxOccurs="unbounded" /> <xs:element ref="basicService" minOccurs="0" maxOccurs="unbounded" /> <xs:element name="supplServicesUsed" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="SuppServiceUsedid" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="ssCode" type="xs:string" minOccurs="0" /> <xs:element name="ssTime" type="xs:string" minOccurs="0" /> </xs:sequence>

    Read the article

  • Performing Aggregate Functions on Multi-Million Row Tables

    - by Daniel Short
    I'm having some serious performance issues with a multi-million row table that I feel I should be able to get results from fairly quick. Here's a run down of what I have, how I'm querying it, and how long it's taking: I'm running SQL Server 2008 Standard, so Partitioning isn't currently an option I'm attempting to aggregate all views for all inventory for a specific account over the last 30 days. All views are stored in the following table: CREATE TABLE [dbo].[LogInvSearches_Daily]( [ID] [bigint] IDENTITY(1,1) NOT NULL, [Inv_ID] [int] NOT NULL, [Site_ID] [int] NOT NULL, [LogCount] [int] NOT NULL, [LogDay] [smalldatetime] NOT NULL, CONSTRAINT [PK_LogInvSearches_Daily] PRIMARY KEY CLUSTERED ( [ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 90) ON [PRIMARY] ) ON [PRIMARY] This table has 132,000,000 records, and is over 4 gigs. A sample of 10 rows from the table: ID Inv_ID Site_ID LogCount LogDay -------------------- ----------- ----------- ----------- ----------------------- 1 486752 48 14 2009-07-21 00:00:00 2 119314 51 16 2009-07-21 00:00:00 3 313678 48 25 2009-07-21 00:00:00 4 298863 0 1 2009-07-21 00:00:00 5 119996 0 2 2009-07-21 00:00:00 6 463777 534 7 2009-07-21 00:00:00 7 339976 503 2 2009-07-21 00:00:00 8 333501 570 4 2009-07-21 00:00:00 9 453955 0 12 2009-07-21 00:00:00 10 443291 0 4 2009-07-21 00:00:00 (10 row(s) affected) I have the following index on LogInvSearches_Daily: /****** Object: Index [IX_LogInvSearches_Daily_LogDay] Script Date: 05/12/2010 11:08:22 ******/ CREATE NONCLUSTERED INDEX [IX_LogInvSearches_Daily_LogDay] ON [dbo].[LogInvSearches_Daily] ( [LogDay] ASC ) INCLUDE ( [Inv_ID], [LogCount]) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] I need to pull inventory only from the Inventory for a specific account id. I have an index on the Inventory as well. I'm using the following query to aggregate the data and give me the top 5 records. 
This query is currently taking 24 seconds to return the 5 rows: StmtText ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- SELECT TOP 5 Sum(LogCount) AS Views , DENSE_RANK() OVER(ORDER BY Sum(LogCount) DESC, Inv_ID DESC) AS Rank , Inv_ID FROM LogInvSearches_Daily D (NOLOCK) WHERE LogDay DateAdd(d, -30, getdate()) AND EXISTS( SELECT NULL FROM propertyControlCenter.dbo.Inventory (NOLOCK) WHERE Acct_ID = 18731 AND Inv_ID = D.Inv_ID ) GROUP BY Inv_ID (1 row(s) affected) StmtText ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |--Top(TOP EXPRESSION:((5))) |--Sequence Project(DEFINE:([Expr1007]=dense_rank)) |--Segment |--Segment |--Sort(ORDER BY:([Expr1006] DESC, [D].[Inv_ID] DESC)) |--Stream Aggregate(GROUP BY:([D].[Inv_ID]) DEFINE:([Expr1006]=SUM([LOALogs].[dbo].[LogInvSearches_Daily].[LogCount] as [D].[LogCount]))) |--Sort(ORDER BY:([D].[Inv_ID] ASC)) |--Nested Loops(Inner Join, OUTER REFERENCES:([D].[Inv_ID])) |--Nested Loops(Inner Join, OUTER REFERENCES:([Expr1011], [Expr1012], [Expr1010])) | |--Compute Scalar(DEFINE:(([Expr1011],[Expr1012],[Expr1010])=GetRangeWithMismatchedTypes(dateadd(day,(-30),getdate()),NULL,(6)))) | | |--Constant Scan | |--Index Seek(OBJECT:([LOALogs].[dbo].[LogInvSearches_Daily].[IX_LogInvSearches_Daily_LogDay] AS [D]), SEEK:([D].[LogDay] > [Expr1011] AND [D].[LogDay] < [Expr1012]) ORDERED FORWARD) |--Index Seek(OBJECT:([propertyControlCenter].[dbo].[Inventory].[IX_Inventory_Acct_ID]), SEEK:([propertyControlCenter].[dbo].[Inventory].[Acct_ID]=(18731) AND [propertyControlCenter].[dbo].[Inventory].[Inv_ID]=[LOA (13 row(s) affected) I tried using a CTE to pick up the rows first and aggregate them, but that didn't run any faster, and gives me essentially the same execution plan. 
(1 row(s) affected) StmtText ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- --SET SHOWPLAN_TEXT ON; WITH getSearches AS ( SELECT LogCount -- , DENSE_RANK() OVER(ORDER BY Sum(LogCount) DESC, Inv_ID DESC) AS Rank , D.Inv_ID FROM LogInvSearches_Daily D (NOLOCK) INNER JOIN propertyControlCenter.dbo.Inventory I (NOLOCK) ON Acct_ID = 18731 AND I.Inv_ID = D.Inv_ID WHERE LogDay DateAdd(d, -30, getdate()) -- GROUP BY Inv_ID ) SELECT Sum(LogCount) AS Views, Inv_ID FROM getSearches GROUP BY Inv_ID (1 row(s) affected) StmtText ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |--Stream Aggregate(GROUP BY:([D].[Inv_ID]) DEFINE:([Expr1004]=SUM([LOALogs].[dbo].[LogInvSearches_Daily].[LogCount] as [D].[LogCount]))) |--Sort(ORDER BY:([D].[Inv_ID] ASC)) |--Nested Loops(Inner Join, OUTER REFERENCES:([D].[Inv_ID])) |--Nested Loops(Inner Join, OUTER REFERENCES:([Expr1008], [Expr1009], [Expr1007])) | |--Compute Scalar(DEFINE:(([Expr1008],[Expr1009],[Expr1007])=GetRangeWithMismatchedTypes(dateadd(day,(-30),getdate()),NULL,(6)))) | | |--Constant Scan | |--Index Seek(OBJECT:([LOALogs].[dbo].[LogInvSearches_Daily].[IX_LogInvSearches_Daily_LogDay] AS [D]), SEEK:([D].[LogDay] > [Expr1008] AND [D].[LogDay] < [Expr1009]) ORDERED FORWARD) |--Index Seek(OBJECT:([propertyControlCenter].[dbo].[Inventory].[IX_Inventory_Acct_ID] AS [I]), SEEK:([I].[Acct_ID]=(18731) AND [I].[Inv_ID]=[LOALogs].[dbo].[LogInvSearches_Daily].[Inv_ID] as [D].[Inv_ID]) ORDERED FORWARD) (8 row(s) affected) (1 row(s) affected) So given that I'm getting good Index Seeks in my execution plan, what can I do to get this running faster? Thanks, Dan
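
    Given that the plan already seeks on both indexes, most of the remaining 24 seconds is spent reading and sorting a month of detail rows on every call; one common workaround (an assumption about this workload, not something suggested in the thread) is to maintain a small pre-aggregated table and point the report at that instead:

        -- Hypothetical summary table, rebuilt periodically (e.g. nightly or hourly)
        CREATE TABLE dbo.LogInvSearches_30Day
        (
            Inv_ID int NOT NULL PRIMARY KEY,
            Views  int NOT NULL
        );

        INSERT INTO dbo.LogInvSearches_30Day (Inv_ID, Views)
        SELECT Inv_ID, SUM(LogCount)
        FROM dbo.LogInvSearches_Daily
        WHERE LogDay > DATEADD(d, -30, GETDATE())
        GROUP BY Inv_ID;

        -- The per-account TOP 5 then runs against the much smaller summary
        SELECT TOP 5 s.Views, s.Inv_ID
        FROM dbo.LogInvSearches_30Day AS s
        JOIN propertyControlCenter.dbo.Inventory AS i
          ON i.Inv_ID = s.Inv_ID AND i.Acct_ID = 18731
        ORDER BY s.Views DESC, s.Inv_ID DESC;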

    Read the article

  • [Flex 4 and .Net] Retrieving tables from SQL database

    - by mG
    Hi everyone, As the title says, I want to retrieve tables of data from a SQL database, using Flex 4 and .Net WebService. I'm new to both Flex and DotNet. Please tell me a proper way to do it. This is what I've done so far: Retrieving an array of string: (this works) .Net: [WebMethod] public String[] getTestArray() { String[] arStr = { "AAA", "BBB", "CCC", "DDD" }; return arStr; } Flex 4: <?xml version="1.0" encoding="utf-8"?> <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" xmlns:mx="library://ns.adobe.com/flex/mx" minWidth="955" minHeight="600"> <fx:Script> <![CDATA[ import mx.collections.ArrayCollection; import mx.controls.Alert; import mx.rpc.events.ResultEvent; [Bindable] private var ac:ArrayCollection = new ArrayCollection(); protected function btn_clickHandler(event:MouseEvent):void { ws.getTestArray(); } protected function ws_resultHandler(event:ResultEvent):void { ac = event.result as ArrayCollection; Alert.show(ac.toString()); } ]]> </fx:Script> <fx:Declarations> <s:WebService id="ws" wsdl="http://localhost:50582/Service1.asmx?WSDL" result="ws_resultHandler(event)"/> </fx:Declarations> <s:Button x="10" y="30" label="Button" id="btn" click="btn_clickHandler(event)"/> </s:Application> Retrieving a DataTable: (this does not work) DotNet: [WebMethod] public DataTable getUsers() { DataTable dt = new DataTable("Users"); SqlConnection conn = new SqlConnection("server = 192.168.1.50; database = MyDatabase; user id = sa; password = 1234; integrated security = false"); SqlDataAdapter da = new SqlDataAdapter("select vFName, vLName, vEmail from Users", conn); da.Fill(dt); return dt; } Flex 4: <?xml version="1.0" encoding="utf-8"?> <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" xmlns:mx="library://ns.adobe.com/flex/mx" minWidth="955" minHeight="600"> <fx:Script> <![CDATA[ import mx.collections.ArrayCollection; import mx.controls.Alert; import mx.rpc.events.ResultEvent; [Bindable] private var ac:ArrayCollection = new ArrayCollection(); protected function btn_clickHandler(event:MouseEvent):void { ws.getUsers(); } protected function ws_resultHandler(event:ResultEvent):void { ac = event.result as ArrayCollection; Alert.show(ac.toString()); } ]]> </fx:Script> <fx:Declarations> <s:WebService id="ws" wsdl="http://localhost:50582/Service1.asmx?WSDL" result="ws_resultHandler(event)"/> </fx:Declarations> <s:Button x="10" y="30" label="Button" id="btn" click="btn_clickHandler(event)"/> </s:Application>

    Read the article

  • How do I make a Data Validation drop-down exclude blanks?

    - by Iszi
    Related: How can I use non-adjacent cells on another sheet for a Data Validation drop-down, and only show non-blank values? For now, I've worked around the above problem by re-arranging my sheet so all the Data Validation Source cells are in one range. I'm leaving the above question open though, because I think it still poses an interesting problem. However, the issue now is that the Data Validation drop-down isn't working in the way I expected it to (and how I believe others are telling me it should). Even though I've got everything into one named range, Excel still shows blanks in a drop-down that references that range.

    Setup: Sheet 1

        A1= (blank)   B1= Header
        A2= 1         B2= Value1
        A3= 2         B3= Value2
        A4= 3         B4= Value3
        A5= 4         B5= (empty)
        A6= 5         B6= (empty)
        A7= 6         B7= (empty)

    Sheet1!B2:B7 is named Validation. Sheet2!A1 is set to use Data Validation with a Source =Validation, and in-cell drop-down. The drop-down in Sheet2!A1 shows:

        Value1
        Value2
        Value3
        .
        .
        .

    (Dots represent blank lines.) How can I get rid of these blank lines in the in-cell drop-down, while still including Sheet1!B5:B7 in the Data Validation Source? Note: I nuked the sheet, and tried it again without column A from Sheet1 (putting values from column B in the above example into column A), and it worked fine. Adding Column A back though, brought the blanks back into the Data Validation drop-down. What do I need to do to keep column A as I want it and keep the in-cell drop-down clean?

    Read the article

  • Transparent Data Encryption Helps Customers Address Regulatory Compliance

    - by Troy Kitch
    Regulations such as the Payment Card Industry Data Security Standards (PCI DSS), U.S. state security breach notification laws, HIPAA HITECH and more, call for the use of data encryption or redaction to protect sensitive personally identifiable information (PII). From the outset, Oracle has delivered the industry's most advanced technology to safeguard data where it lives—in the database. Oracle provides a comprehensive portfolio of security solutions to ensure data privacy, protect against insider threats, and enable regulatory compliance for both Oracle and non-Oracle Databases. Organizations worldwide rely on Oracle Database Security solutions to help address industry and government regulatory compliance. Specifically, Oracle Advanced Security helps organizations like Educational Testing Service, TransUnion Interactive, Orbitz, and the National Marrow Donor Program comply with privacy and regulatory mandates by transparently encrypting sensitive information such as credit cards, social security numbers, and personally identifiable information (PII). By encrypting data at rest and whenever it leaves the database over the network or via backups, Oracle Advanced Security provides organizations the most cost-effective solution for comprehensive data protection. Watch the video and learn why organizations choose Oracle Advanced Security with transparent data encryption.
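
    As a rough illustration of how transparent the encryption is to applications, column- and tablespace-level TDE in Oracle can be declared directly in DDL once an encryption wallet has been configured (object names here are invented):

        -- Encrypt an existing sensitive column in place
        ALTER TABLE customers MODIFY (credit_card_number ENCRYPT USING 'AES256');

        -- Or create an encrypted tablespace and store sensitive tables in it
        CREATE TABLESPACE secure_data
          DATAFILE 'secure_data01.dbf' SIZE 100M
          ENCRYPTION USING 'AES256'
          DEFAULT STORAGE (ENCRYPT);

    Queries and application code continue to reference the columns unchanged; encryption and decryption happen inside the database.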

    Read the article

  • Adventures in Windows 8: Understanding and debugging design time data in Expression Blend

    - by Laurent Bugnion
    One of my favorite features in Expression Blend is the ability to attach a Visual Studio debugger to Blend. First let’s start by answering the question: why exactly do you want to do that? Note: If you are familiar with the creation and usage of design time data, feel free to scroll down to the paragraph titled “When design time data fails”. Creating design time data for your app When a designer works on an app, he needs to see something to design. For “static” UI such as buttons, backgrounds, etc, the user interface elements are going to show up in Blend just fine. If however the data is fetched dynamically from a service (web, database, etc) or created dynamically, most probably Blend is going to show just an empty element. The classical way to design at that stage is to run the application, navigate to the screen that is under construction (which can involve delays, need to log in, etc…), to measure what is on the screen (colors, margins, width and height, etc) using various tools, going back to Blend, editing the properties of the elements, running again, etc. Obviously this is not ideal. The solution is to create design time data. For more information about the creation of design time data by mocking services, you can refer to two talks of mine “Deep dive MVVM” and “MVVM Applied From Silverlight to Windows Phone to Windows 8”. The source code for these talks is here and here. Design time data in MVVM Light One of the main reasons why I developed MVVM Light is to facilitate the creation of design time data. To illustrate this, let’s create a new MVVM Light application in Visual Studio. Install MVVM Light from here: http://mvvmlight.codeplex.com (use the MSI in the Download section). After installing, make sure to read the Readme that opens up in your favorite browser, you will need one more step to install the Project Templates. Start Visual Studio 2012. Create a new MvvmLight (Win8) app. Run the application. You will see a string showing “Welcome to MVVM Light”. In the Solution explorer, right click on MainPage.xaml and select Open in Blend. Now you should see “Welcome to MVVM Light [Design]” What happens here is that Expression Blend runs different code at design time than the application runs at runtime. To do this, we use design-time detection (as explained in a previous article) and use that information to initialize a different data service at design time. To understand this better, open the ViewModelLocator.cs file in the ViewModel folder and see how the DesignDataService is used at design time, while the DataService is used at runtime. In a real-life applicationm, DataService would be used to connect to a web service, for instance. When design time data fails Sometimes however, the creation of design time data fails. It can be very difficult to understand exactly what is happening. Expression Blend is not giving a lot of information about what happened. Thankfully, we can use a trick: Attaching a debugger to Expression Blend and debug the design time code. In WPF and Silverlight (including Windows Phone 7), you could simply attach the debugger to Blend.exe (using the “Managed (v4.5, v4.0) code” option even for Silverlight!!) In Windows 8 however, things are just a bit different. This is because the designer that renders the actual representation of the Windows 8 app runs in its own process. Let’s illustrate that: Open the file DesignDataService in the Design folder. 
Modify the GetData method to look like this: public void GetData(Action<DataItem, Exception> callback) { throw new Exception(); // Use this to create design time data var item = new DataItem("Welcome to MVVM Light [design]"); callback(item, null); } Go to Blend and build the application. The build succeeds, but now the page is empty. The creation of the design time data failed, but we don’t get a warning message. We need to investigate what’s wrong. Close MainPage.xaml Go to Visual Studio and select the menu Debug, Attach to Process. Update: Make sure that you select “Managed (v4.5, v4.0) code” in the “Attach to” field. Find the process named XDesProc.exe. You should have at least two, one for the Visual Studio 2012 designer surface, and one for Expression Blend. Unfortunately in this screen it is not obvious which is which. Let’s find out in the Task Manager. Press Ctrl-Alt-Del and select Task Manager Go to the Details tab and sort the processes by name. Find the one that says “Blend for Microsoft Visual Studio 2012 XAML UI Designer” and write down the process ID. Go back to the Attach to Process dialog in Visual Studio. sort the processes by ID and attach the debugger to the correct instance of XDesProc.exe. Open the MainViewModel (in the ViewModel folder) Place a breakpoint on the first line of the MainViewModel constructor. Go to Blend and open the MainPage.xaml again. At this point, the debugger breaks in Visual Studio and you can execute your code step by step. Simply step inside the dataservice call, and find the exception that you had placed there. Visual Studio gives you additional information which helps you to solve the issue. More info and Conclusion I want to thank the amazing people on the Expression Blend team for being very fast in guiding me in that matter and encouraging me to blog about it. More information about the XDesProc.exe process can be found here. I had to work on a Windows 8 app for a few days without design time data because of an Exception thrown somewhere in the code, and it was really painful. With the debugger, finding the issue was a simple matter of stepping into the code until it threw the exception.   Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn

    Read the article

  • testdisk - recover partition table

    - by Evaggelos Balaskas
    I destroyed the partition table of my laptop. Testdisk reports the following:

        Disk laptop.img - 250 GB / 232 GiB - CHS 30402 255 63 (RO)
             Partition               Start          End      Size in sectors
        >P MS Data                  435868       456606         20739 [NO NAME]
         P MS Data                19232600     19235479          2880 [NO NAME]
         D MS Data                41945087     83890143      41945057
         D MS Data                57151486    168579069     111427584
         D MS Data                67637246    141037565      73400320
         D MS Data               151523326    193466365      41943040
         D MS Data               170617328    170618223           896
         D MS Data               170631168    170634047          2880
         D MS Data               171338232    171344405          6174 [Boot]
         D MS Data               172008235    172231918        223684 [NO NAME]
         P MS Data               193466368    214437887      20971520
         D MS Data               217321375    225321678       8000304 [root]
         D MS Data               224923646    308809725      83886080 [media]
         D MS Data               308809728    420237311     111427584
         D MS Data               418910206    481824765      62914560 [vmimages]

    My partition table had 3 primary partitions: 1. WinXP Home, 2. /boot, 3. LVM. Inside LVM I had 9 or 10 LVM partitions; one of them was my home (encrypted with LUKS). Testdisk can't recover my partition table or any other partition. The partitions marked [P] don't have any useful data. I want to use dd to extract the partitions and try to recover as many files as I can. Any ideas of how I can extract e.g. the [root] LVM partition from the above testdisk report? I am afraid that my disk was also corrupted.

    Read the article

  • Oracle Data Integrator Demo Webcast - Next Webcast - November 21st, 2013

    - by Javier Puerta
    The ODI Product Management team will be hosting a demonstration webcast of Oracle Data Integrator regularly. We will be showing baseline functionality, and covering special topics as requested by our customers. Attendance to these webcasts is open to customers and partners.

    Webcast Format - The same format will be followed for each presentation:

        05 minutes - Background & Overview
        30 minutes - Introduction to ODI Features
        15 minutes - Drill-Down into Special Topics
        10 minutes - Questions and Answers

    Next Webcast Special Topic: Oracle Data Integrator 12c

    Webcast Details: Thursday November 21st 2013, 10:00 AM PST | 1:00 PM EST | 6:00 PM CET (1 hour)

        Web Conference Link: 594 942 837 (https://oracleconferencing.webex.com)
        Dial-In Number: AMER: 1-866-682-4770 (More Numbers)
        Phone Meeting ID/Passcode: 3096713/505638

    More information on Oracle Data Integrator (ODI): Learn more about Oracle Data Integrator. Download Oracle Data Integrator 12c. Oracle Data Integrator Webcast Archive.

    Read the article

  • Constricted A* problem

    - by Ragekit
    I've got a little problem with an A* algorithm that I need to constrict a little bit. Basically : I use an A* to find the shortest path between 2 randomly placed room in 3D space, and then build a corridor between them. The problem I found is that sometimes it makes chimney like corridors that are not ideal, so I constrict the A* so that if the last movement was up or down, you go sideways. Everything is fine, but in some corner cases, it fails to find a path (when there is obviously one). Like here between the blue and red dot : (i'm in unity btw, but i don't think it matters) Here is the code of the actual A* (a bit long, and some redundency) while(current != goal) { //add stair up / stair down foreach(Node<GridUnit> test in current.Neighbors) { if(!test.Data.empty && test != goal) continue; //bug at arrival; if(test == goal && penul !=null) { Vector3 currentDiff = current.Data.bounds.center - test.Data.bounds.center; if(!Mathf.Approximately(currentDiff.y,0)) { //wanna drop on the last if(!coplanar(test.Data.bounds.center,current.Data.bounds.center,current.Data.parentUnit.bounds.center,to.Data.bounds.center)) { continue; } else { if(Mathf.Approximately(to.Data.bounds.center.x, current.Data.parentUnit.bounds.center.x) && Mathf.Approximately(to.Data.bounds.center.z, current.Data.parentUnit.bounds.center.z)) { continue; } } } } if(current.Data.parentUnit != null) { Vector3 previousDiff = current.Data.parentUnit.bounds.center - current.Data.bounds.center; Vector3 currentDiff = current.Data.bounds.center - test.Data.bounds.center; if(!Mathf.Approximately(previousDiff.y,0)) { if(!Mathf.Approximately(currentDiff.y,0)) { //you wanna drop now : continue; } if(current.Data.parentUnit.parentUnit != null) { if(!coplanar(test.Data.bounds.center,current.Data.bounds.center,current.Data.parentUnit.bounds.center,current.Data.parentUnit.parentUnit.bounds.center)) { continue; }else { if(Mathf.Approximately(test.Data.bounds.center.x, current.Data.parentUnit.parentUnit.bounds.center.x) && Mathf.Approximately(test.Data.bounds.center.z, current.Data.parentUnit.parentUnit.bounds.center.z)) { continue; } } } } } g = current.Data.g + HEURISTIC(current.Data,test.Data); h = HEURISTIC(test.Data,goal.Data); f = g + h; if(open.Contains(test) || closed.Contains(test)) { if(test.Data.f > f) { //found a shorter path going passing through that point test.Data.f = f; test.Data.g = g; test.Data.h = h; test.Data.parentUnit = current.Data; } } else { //jamais rencontré test.Data.f = f; test.Data.h = h; test.Data.g = g; test.Data.parentUnit = current.Data; open.Add(test); } } closed.Add (current); if(open.Count == 0) { Debug.Log("nothingfound"); //nothing more to test no path found, stay to from; List<GridUnit> r = new List<GridUnit>(); r.Add(from.Data); return r; } //sort open from small to biggest travel cost open.Sort(delegate(Node<GridUnit> x, Node<GridUnit> y) { return (int)(x.Data.f-y.Data.f); }); //get the smallest travel cost node; Node<GridUnit> smallest = open[0]; current = smallest; open.RemoveAt(0); } //build the path going backward; List<GridUnit> ret = new List<GridUnit>(); if(penul != null) { ret.Insert(0,to.Data); } GridUnit cur = goal.Data; ret.Insert(0,cur); do{ cur = cur.parentUnit; ret.Insert(0,cur); } while(cur != from.Data); return ret; You see at the start of the foreach i constrict the A* like i said. If you have any insight it would be cool. Thanks

    Read the article

  • Combine auto-syncing cloud and VCS

    - by ComFreek
    This question brought me to another question: is there any VCS, or tool for a VCS, that automatically backs up your source code between the last checkout and the current changes? I had the problem of losing uncommitted source code changes just one week ago. I did not want to commit yet because the changes were incomplete. But then an error while moving the data to a USB stick caused the data loss. That's the opposite of what a cloud service (like Google Drive, SkyDrive, Dropbox, ...) does: it tracks each change you make! Have you lost your data? That's no problem, because you have the latest version online. So what would a combined solution look like? It would offer the full functionality of a VCS, including auto-syncing of any intermediate changes between two commits/checkouts to a temporary online location.
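    One way to approximate that combination today, sketched below under assumptions (a Git repository, the git CLI on the PATH, a hypothetical repository path and ref name): watch the working tree and record the uncommitted state with `git stash create`, which writes a commit object without touching the index or working tree, then keep that snapshot reachable under a backup ref, which could also be pushed to a remote for the off-site copy. This is an illustrative sketch, not an existing tool, and it omits debouncing and error handling.

    using System;
    using System.Diagnostics;
    using System.IO;

    class AutoSnapshot
    {
        const string RepoPath = @"C:\src\my-project";   // hypothetical repository

        static void Main()
        {
            var watcher = new FileSystemWatcher(RepoPath) { IncludeSubdirectories = true };
            watcher.Changed += OnChange;
            watcher.Created += OnChange;
            watcher.Deleted += OnChange;
            watcher.EnableRaisingEvents = true;
            Console.ReadLine();   // keep the process alive
        }

        static void OnChange(object sender, FileSystemEventArgs e)
        {
            // Ignore Git's own bookkeeping, otherwise every snapshot retriggers the watcher.
            if (e.FullPath.Contains(Path.DirectorySeparatorChar + ".git")) return;
            Snapshot();
        }

        static void Snapshot()
        {
            // `git stash create` prints the hash of a commit holding the dirty state.
            string hash = RunGit("stash create").Trim();
            if (hash.Length == 0) return;                    // nothing uncommitted
            RunGit("update-ref refs/backups/wip " + hash);   // keep the snapshot reachable
            // Optionally push it off-site: RunGit("push origin refs/backups/wip");
        }

        static string RunGit(string args)
        {
            var psi = new ProcessStartInfo("git", args)
            {
                WorkingDirectory = RepoPath,
                RedirectStandardOutput = true,
                UseShellExecute = false
            };
            using (var p = Process.Start(psi))
            {
                string output = p.StandardOutput.ReadToEnd();
                p.WaitForExit();
                return output;
            }
        }
    }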

    Read the article

  • Extracting, Transforming, and Loading (ETL) Process

    The process of extracting, transforming, and loading data into a data warehouse is called the Extract, Transform, Load (ETL) process. This process can be used to obtain, analyze, and clean data from various data sources so that it can be stored in a uniform manner within a data warehouse. This data can then be used by various business intelligence processes to provide an organization with a more in-depth analysis of the current state of the company and where it is heading. A standard ETL process that might be used by a health care system could include importing all of their patients' names, diagnoses, and prescriptions into a unified data warehouse, so that trends can be spotted in regard to outbreaks like the flu, and potential illnesses that a patient might be affected by can be predicted based on other patients with similar symptoms.
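    As a concrete, hedged illustration of one such step in .NET: the sketch below extracts patient rows from a hypothetical operational database, applies a trivial cleaning transform, and bulk loads the result into a hypothetical warehouse table. All server, table, and column names are made up for the example.

    using System.Data;
    using System.Data.SqlClient;

    class PatientEtl
    {
        static void Main()
        {
            var staging = new DataTable();

            // Extract: pull the raw rows from the operational (source) database.
            using (var src = new SqlConnection("Server=OLTP;Database=Clinic;Integrated Security=true"))
            using (var cmd = new SqlCommand("SELECT PatientName, Diagnosis, Prescription FROM dbo.Visits", src))
            {
                src.Open();
                staging.Load(cmd.ExecuteReader());
            }

            // Transform: clean the data so it is stored uniformly (a trivial normalization here).
            foreach (DataRow row in staging.Rows)
            {
                if (row["PatientName"] is string name)
                    row["PatientName"] = name.Trim().ToUpperInvariant();
            }

            // Load: bulk copy the cleaned rows into the warehouse table.
            using (var dest = new SqlConnection("Server=DW;Database=Warehouse;Integrated Security=true"))
            {
                dest.Open();
                using (var bulk = new SqlBulkCopy(dest) { DestinationTableName = "dbo.FactPatientVisit" })
                {
                    bulk.ColumnMappings.Add("PatientName", "PatientName");
                    bulk.ColumnMappings.Add("Diagnosis", "Diagnosis");
                    bulk.ColumnMappings.Add("Prescription", "Prescription");
                    bulk.WriteToServer(staging);
                }
            }
        }
    }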

    Read the article

  • SQL Server 2005 Disk Configuration: Single RAID 1+0 or multiple RAID 1+0s?

    - by mfredrickson
    Assuming that the workload for the SQL Server is just a normal OLTP database, and that there are a total of 20 disks available, which configuration would make more sense? A single RAID 1+0 containing all 20 disks: this physical volume would contain both the data files and the transaction log files, but two logical drives would be created from this RAID, one for the data files and one for the log files. Or... two RAID 1+0s, each containing 10 disks: one physical volume would contain the data files, and the other would contain the log files. The reason for this question is a disagreement between me (a SQL developer) and a co-worker (a DBA). In every configuration that I've done, or seen others do, the data files and transaction log files were separated at the physical level and were placed on separate RAIDs. However, my co-worker's argument is that by placing all the disks into a single RAID 1+0, any IO done by the server is potentially spread across all 20 disks, instead of just 10 disks in my suggested configuration. Conceptually, his argument makes sense to me. Also, I've found some information from Microsoft that seems to support his position: http://technet.microsoft.com/en-us/library/cc966414.aspx. In the section titled "3. RAID10 Configuration", showing a configuration in which all 20 disks are allocated to a single RAID 1+0, it states: "In this scenario, the I/O parallelism can be used to its fullest by all partitions. Therefore, distribution of I/O workload is among 20 physical spindles instead of four at the partition level." But... every other configuration I've seen suggests physically separating the data and log files onto separate RAIDs, and everything I've found here on Server Fault suggests the same. I understand that log files will be write-heavy and that data files will be a combination of reads and writes, but does this require that the files be placed on separate RAIDs instead of a single RAID?
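    Whichever layout is chosen, the decision can be sanity-checked against measured numbers rather than opinion. A hedged sketch (the connection string is hypothetical) that reads sys.dm_io_virtual_file_stats, available since SQL Server 2005, to show how reads, writes, and stalls actually split between data and log files:

    using System;
    using System.Data.SqlClient;

    class FileIoStats
    {
        static void Main()
        {
            const string sql = @"
                SELECT DB_NAME(vfs.database_id) AS db,
                       mf.type_desc, mf.physical_name,
                       vfs.num_of_reads, vfs.num_of_writes,
                       vfs.io_stall_read_ms, vfs.io_stall_write_ms
                FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
                JOIN sys.master_files AS mf
                  ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
                ORDER BY vfs.io_stall_write_ms DESC;";

            using (var conn = new SqlConnection("Server=.;Integrated Security=true"))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // One line per data or log file: how busy it is and how long IO waits.
                        Console.WriteLine("{0,-20} {1,-6} reads={2} writes={3} readStall={4}ms writeStall={5}ms",
                            reader["db"], reader["type_desc"],
                            reader["num_of_reads"], reader["num_of_writes"],
                            reader["io_stall_read_ms"], reader["io_stall_write_ms"]);
                    }
                }
            }
        }
    }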

    Read the article

  • How to eager load sibling data using LINQ to SQL?

    - by Scott
    The goal is to issue the fewest queries to SQL Server using LINQ to SQL without using anonymous types. The return type for the method will need to be IList<Child1>. The relationships between the entities (Parent, Child1, Child2, Grandchild1) are as follows: Parent -> Child1 is a one-to-many relationship; Child1 -> Grandchild1 is a one-to-n relationship (where n is zero to infinity); Parent -> Child2 is a one-to-n relationship (where n is zero to infinity). I am able to eager load the Parent, Child1 and Grandchild1 data, resulting in one query to SQL Server. This query with load options eager loads all of the data except the sibling data (Child2):

    DataLoadOptions loadOptions = new DataLoadOptions();
    loadOptions.LoadWith<Child1>(o => o.GrandChild1List);
    loadOptions.LoadWith<Child1>(o => o.Parent);
    dataContext.LoadOptions = loadOptions;

    IQueryable<Child1> children = from child in dataContext.Child1
                                  select child;

    I need to load the sibling data as well. One approach I have tried is splitting the query into two LINQ to SQL queries and merging the result sets together (not pretty); however, upon accessing the sibling data it is lazy loaded anyway. Adding the sibling load option will issue a query to SQL Server for each Grandchild1 and Child2 record (which is exactly what I am trying to avoid):

    DataLoadOptions loadOptions = new DataLoadOptions();
    loadOptions.LoadWith<Child1>(o => o.GrandChild1List);
    loadOptions.LoadWith<Child1>(o => o.Parent);
    loadOptions.LoadWith<Parent>(o => o.Child2List);
    dataContext.LoadOptions = loadOptions;

    IQueryable<Child1> children = from child in dataContext.Child1
                                  select child;

    exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=1
    exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=2
    exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=3
    exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=4

    I've also written LINQ to SQL queries to join in all of the data in the hope that it would be eager loaded; however, when the LINQ to SQL EntitySet of Child2 or Grandchild1 is accessed, it lazy loads the data. The reason for returning IList<Child1> is to hydrate business objects. My thoughts are that I am either approaching this problem the wrong way, should consider calling a stored procedure, or my organization should not be using LINQ to SQL as an ORM. Any help is greatly appreciated. Thank you, -Scott
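    A hedged sketch of one possible workaround (not necessarily the best one): keep the load options for Child1 -> Parent and Child1 -> Grandchild1, then fetch all sibling Child2 rows for the parents involved in a single extra query and group them in memory, instead of letting each Parent.Child2List lazy-load one query at a time. Since the filter uses Contains on a local list of IDs, LINQ to SQL translates it to a single IN clause, so this is two round trips in total. It assumes the usual System.Linq and System.Data.Linq usings plus the question's generated classes, and the property names (ParentId, ForeignKeyToParent) follow the question's SQL and may not match the real model exactly.

    var loadOptions = new DataLoadOptions();
    loadOptions.LoadWith<Child1>(o => o.GrandChild1List);
    loadOptions.LoadWith<Child1>(o => o.Parent);
    dataContext.LoadOptions = loadOptions;

    // Query 1: Child1 with Parent and Grandchild1 eager loaded.
    IList<Child1> children = dataContext.Child1.ToList();

    // Query 2: every Child2 for the parents we just loaded, in one round trip (IN clause).
    var parentIds = children.Select(c => c.Parent.ParentId).Distinct().ToList();
    ILookup<int, Child2> siblingsByParent = dataContext.Child2
        .Where(s => parentIds.Contains(s.ForeignKeyToParent))
        .ToLookup(s => s.ForeignKeyToParent);

    // Hand siblingsByParent[child.Parent.ParentId] to the business objects while hydrating,
    // rather than touching Parent.Child2List and triggering lazy loads.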

    Read the article

  • Why isn't SQL Management Studio integrated in Visual Studio?

    - by Rob Packwood
    I have both SQL Server 2005 and Visual Studio 2008 installed and think it would be really nice to have SQL Management Studio integrated directly within Visual Studio. Is there a way to make that happen? What about in VS 2010 with SQL Server 2008? I also find the Visual Studio Server Explorer window to be much slower than the Object Browser in SQL Server Management Studio... it would be nice to never really need to use the Server Explorer.

    Read the article

  • Which is the "best" data access framework/approach for C# and .NET?

    - by Frans
    (EDIT: I made it a community wiki as it is more suited to a collaborative format.) There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons, and it will never be a simple question of which is "best" - the answer will always be "it depends". However, I am looking for a high-level comparison of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from that for an in-house enterprise-level CRUD application. I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go. So far, this is my understanding at a high level - but I am sure it is wrong... I am primarily focusing on the Microsoft approaches to keep this focused.

    ADO.NET Entity Framework
    - Database agnostic
    - Good because it allows swapping backends in and out
    - Bad because it can hit performance, and database vendors are not too happy about it
    - Seems to be MS's preferred route for the future
    - Complicated to learn (though, see 267357)
    - It is accessed through LINQ to Entities, so it provides ORM, thus allowing abstraction in your code

    LINQ to SQL
    - Uncertain future (see "Is LINQ to SQL truly dead?")
    - Easy to learn (?)
    - Only works with MS SQL Server
    - See also "Pros and cons of LINQ"

    "Standard" ADO.NET
    - No ORM
    - No abstraction, so you are back to "roll your own" and play with dynamically generated SQL
    - Direct access, allows potentially better performance

    This ties into the age-old debate of whether to focus on objects or relational data, to which the answer of course is "it depends on where the bulk of the work is", and since that is an unanswerable question hopefully we don't have to go into it too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas, if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then ORM makes complete sense. So, I guess my argument for good old-fashioned ADO.NET would be the case where you manipulate and modify large datasets, in which case you will benefit from the direct access to the backend. Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures.

    ASP.NET Data Source Controls
    - Are these something altogether different or just a layer over standard ADO.NET? Would you really use these if you had a DAL or if you implemented LINQ or Entities?

    NHibernate
    - Seems to be a very powerful ORM?
    - Open source

    Some other relevant links: NHibernate or LINQ to SQL; Entity Framework vs LINQ to SQL.
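    To make the trade-off between "roll your own" ADO.NET and an ORM-style approach concrete, here is the same read written both ways. This is purely illustrative: the Customer class, the dbo.Customers table, and the column names are hypothetical, and the ORM half only shows the query shape, not the mapping setup.

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Linq;

    class Customer
    {
        public string Name { get; set; }
        public bool IsActive { get; set; }
    }

    static class DataAccessComparison
    {
        // "Standard" ADO.NET: you write the SQL and the mapping by hand, and you control both.
        public static List<string> ActiveCustomerNamesAdoNet(string connectionString)
        {
            var names = new List<string>();
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SELECT Name FROM dbo.Customers WHERE IsActive = 1", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        names.Add(reader.GetString(0));
            }
            return names;
        }

        // ORM-style (LINQ to SQL / LINQ to Entities): the query is composed against mapped
        // classes and the provider generates the SQL. In LINQ to SQL, the IQueryable<Customer>
        // passed in would be the DataContext's Table<Customer>.
        public static List<string> ActiveCustomerNamesOrm(IQueryable<Customer> customers)
        {
            return customers.Where(c => c.IsActive)
                            .Select(c => c.Name)
                            .ToList();
        }
    }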

    Read the article
