Search Results

Search found 19479 results on 780 pages for 'total record count'.

Page 88/780 | < Previous Page | 84 85 86 87 88 89 90 91 92 93 94 95  | Next Page >

  • Help Creating a Google Analytics Funnel for Check out process

    - by Drew
    I have a funnel question. I am currently working on tracking (through GA) guest and logged-in member activity once they get to my site's shopping cart, but I need help with setting up funnels. Specifically, I want to see: total sales, logged-in member total sales, and guest member sales. The URLs associated with the checkout process are - Logged-in members: /cart (arriving at checkout), /checkout (checking out as a logged-in member), /checkout/confirmation (thank you - confirmed sale); Guest members: /cart (arriving at checkout), /checkout-guest (checking out as a guest), /checkout/confirmation (thank you - confirmed sale). I've tested the funnels set up for the above with 9 transactions, but the end maths doesn't seem to line up. The total sales funnel shows 9 completed transactions when tracking only these two URLs: /cart and /checkout/confirmation. Which is great - it's working. The logged-in member sales funnel shows a total of 9 completed transactions based on each step of the logged-in URL steps (above) being tracked in a funnel. Not good, because this number should be 3. The guest checkout funnel (see guest steps above) shows 9 as well. The results I am looking for should reflect the following: total sales = 9, logged-in members = 3, guest members = 6. Is there any way to set these URLs up so that the funnels report the correct results, or do I need to change the URLs and provide logged-in members and guests with stand-alone purchase confirmation pages (which would mean I could not track total sales combining results from both streams)? Any knowledge in this area is welcome. Thanks.

    Read the article

  • Choosing Technology To Include In Software Design

    How many of us have been forced to select one technology over another when designing a new system? What factors do we and should we consider? How can we ensure the correct business decision is made? When faced with this type of decision it is important to gather as much information as possible regarding each technology being considered, as well as the project itself. Additionally, I tend to delay my decision about the technology until it ultimately has to be made. The reason I tend to delay such an important design decision is that, as the project progresses, requirements and other factors can alter which technology is best for the project. Important factors to consider when making technology decisions: Time to Implement and Maintain; Total Cost of Technology (including implementation and maintenance); Adaptability of Technology; Implementation Team's Skill Sets; Complexity of Technology (including implementation and maintenance); Forecasted Return On Investment (ROI); Forecasted Profit on Investment (POI). Of these factors, ROI and POI weigh the heaviest because they take the other factors into consideration when calculating the profitability and return on investment. For a real-world example, let us consider developing a web-based lead management system for a new company. This system can be hosted either on a Microsoft Windows-based web server or on a Linux-based web server. Important factors for this example: Implementation Team's Skill Sets - Member 1: Skill Set: Classic ASP, ASP.Net, and MS SQL Server; Experience: 10 years. Member 2: Skill Set: PHP, MySQL, Photoshop and MS SQL Server; Experience: 3 years. Member 3: Skill Set: C++, VB6, ASP.Net, and MS SQL Server; Experience: 12 years. Total Cost of Technology (including implementation and maintenance): Linux - Initial Year: $5,000 (Random Value); Additional Years: $3,000 (Random Value). Windows - Initial Year: $10,000 (Random Value); Additional Years: $3,000 (Random Value). Complexity of Technology: Linux - Large learning curve with user-driven documentation; estimated learning cost: $30,000. Windows - Minimal, based on the team's skills, with Microsoft-based documentation; estimated learning cost: $5,000. ROI: Linux - Initial Total Cost: $35,000; Additional Cost: $3,000 per year. Windows - Initial Total Cost: $15,000; Additional Cost: $3,000 per year. Based on these hypothetical numbers it would make more sense to select the Windows-based web server, because the initial investment in the technology is much lower compared to the Linux-based web server.

    Read the article

  • How to write PowerShell code part 2 (Using function)

    - by ybbest
    In the last post, I showed you how to use an external configuration file in your PowerShell script. In this post, I will show you how to create a PowerShell function and call an external PowerShell script. You can download the script here. 1. In the original script, I create the site directly using the New-SPSite command. I will refactor it so that a new function creates the site using New-SPSite. A PowerShell function is quite similar to a C# method: you put your function parameters in () and separate each parameter with a comma (,), then you put your method body in {}.
    function add ([int] $num1, [int] $num2) {
        $total = $num1 + $num2
        # Return $total
        $total
    }
    2. The difference is that you do not need a semi-colon (;) at the end of each statement, and when calling the function you do not need a comma (,) to separate each parameter.
    function add ([int] $num1, [int] $num2) {
        $total = $num1 + $num2
        # Return $total
        $total
    }
    # Calling the function
    [int] $num1 = 3
    [int] $num2 = 4
    $d = add $num1 $num2
    Write-Host $d
    3. If you would like to return anything from the function, you just need to type the object you would like to return; there is no need to type return, e.g. $ObjectToReturn, not return $ObjectToReturn.

    Read the article

  • I need a form to tie MySQL Records together.

    - by John R
    I need an HTML-based form to tie database records together. The user scenario that I envision is something like this: A super-user does a search of the database and is delivered a table of records based on that search; each record is numbered with the database record ID. There would also be two text fields next to each record. These text fields would allow the user to make a reference from one record to another. For example, a user could enter into record ID #457 the integer 242 to indicate that there is a correlation with record ID #242. The user would also describe the type of relationship; this could be accomplished with a simple integer in the second field that indicates the type of relationship between the two records. When the super-user hits the submit button, all of these relationships are saved in a MySQL join table. One option is to give me advice on how to implement and code this myself in PHP. However, before I reinvent the wheel, another option is to point me to a free script that does something similar.
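    A minimal sketch of what such a MySQL join table and the insert behind the submit button might look like (the table and column names here are hypothetical, not taken from the question):

        -- Hypothetical join table: one row per relationship between two records
        CREATE TABLE record_relationships (
            id                INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
            record_id         INT UNSIGNED NOT NULL,   -- e.g. 457
            related_record_id INT UNSIGNED NOT NULL,   -- e.g. 242
            relationship_type INT NOT NULL,            -- integer code for the kind of relationship
            UNIQUE KEY uq_record_pair (record_id, related_record_id)
        );

        -- One insert per pair submitted from the form
        INSERT INTO record_relationships (record_id, related_record_id, relationship_type)
        VALUES (457, 242, 1);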

    Read the article

  • How are objects modelled in a functional programming language?

    - by Giorgio
    In an answer to this question (written by Pete) there are some considerations about OOP versus FP. In particular, it is suggested that FP languages are not very suitable for modelling (persistent) objects that have an identity and a mutable state. I was wondering if this is true or, in other words, how one would model objects in a functional programming language. From my basic knowledge of Haskell I thought that one could use monads in some way, but I really do not know enough on this topic to come up with a clear answer. So, how are entities with an identity and a mutable persistent state normally modelled in a functional language? EDIT Here are some further details to clarify what I have in mind. Take a typical Java application in which I can (1) read a record from a database table into a Java object, (2) modify the object in different ways, (3) save the modified object to the database. How would this be implemented e.g. in Haskell? I would initially read the record into a record value (defined by a data definition), perform different transformations by applying functions to this initial value (each intermediate value is a new, modified copy of the original record) and then write the final record value to the database. Is this all there is to it? How can I ensure that at each moment in time only one copy of the record is valid / accessible? One does not want to have different immutable values representing different snapshots of the same object to be accessible at the same time.

    Read the article

  • I need a form to tie MySQL Records together (i.e., many-to-many relationships).

    - by John R
    I need an HTML-based form to tie database records together. The user scenario that I envision is something like this: A super-user does a search of the database and is delivered a table of records based on that search; each record is numbered with the database record ID. There would also be two text fields next to each record. These text fields would allow the user to make a reference from one record to another. For example, a user could enter into record ID #457 that there is a correlation with record ID #242. The user would also describe the type of relationship; this could be done with a simple integer that represents the type of relationship between the two records. When the super-user hits the submit button, all of these relationships are saved in a MySQL join table. One option is to give me advice on how to implement and code this myself in PHP. However, before I reinvent the wheel, another option is to point me to a free script that does something similar.

    Read the article

  • Advantage Database Server: slow stored procedure performance.

    - by ie
    I have a question about a performance of stored procedures in the ADS. I created a simple database with the following structure: CREATE TABLE MainTable ( Id INTEGER PRIMARY KEY, Name VARCHAR(50), Value INTEGER ); CREATE UNIQUE INDEX MainTableName_UIX ON MainTable ( Name ); CREATE TABLE SubTable ( Id INTEGER PRIMARY KEY, MainId INTEGER, Name VARCHAR(50), Value INTEGER ); CREATE INDEX SubTableMainId_UIX ON SubTable ( MainId ); CREATE UNIQUE INDEX SubTableName_UIX ON SubTable ( Name ); CREATE PROCEDURE CreateItems ( MainName VARCHAR ( 20 ), SubName VARCHAR ( 20 ), MainValue INTEGER, SubValue INTEGER, MainId INTEGER OUTPUT, SubId INTEGER OUTPUT ) BEGIN DECLARE @MainName VARCHAR ( 20 ); DECLARE @SubName VARCHAR ( 20 ); DECLARE @MainValue INTEGER; DECLARE @SubValue INTEGER; DECLARE @MainId INTEGER; DECLARE @SubId INTEGER; @MainName = (SELECT MainName FROM __input); @SubName = (SELECT SubName FROM __input); @MainValue = (SELECT MainValue FROM __input); @SubValue = (SELECT SubValue FROM __input); @MainId = (SELECT MAX(Id)+1 FROM MainTable); @SubId = (SELECT MAX(Id)+1 FROM SubTable ); INSERT INTO MainTable (Id, Name, Value) VALUES (@MainId, @MainName, @MainValue); INSERT INTO SubTable (Id, Name, MainId, Value) VALUES (@SubId, @SubName, @MainId, @SubValue); INSERT INTO __output SELECT @MainId, @SubId FROM system.iota; END; CREATE PROCEDURE UpdateItems ( MainName VARCHAR ( 20 ), MainValue INTEGER, SubValue INTEGER ) BEGIN DECLARE @MainName VARCHAR ( 20 ); DECLARE @MainValue INTEGER; DECLARE @SubValue INTEGER; DECLARE @MainId INTEGER; @MainName = (SELECT MainName FROM __input); @MainValue = (SELECT MainValue FROM __input); @SubValue = (SELECT SubValue FROM __input); @MainId = (SELECT TOP 1 Id FROM MainTable WHERE Name = @MainName); UPDATE MainTable SET Value = @MainValue WHERE Id = @MainId; UPDATE SubTable SET Value = @SubValue WHERE MainId = @MainId; END; CREATE PROCEDURE SelectItems ( MainName VARCHAR ( 20 ), CalculatedValue INTEGER OUTPUT ) BEGIN DECLARE @MainName VARCHAR ( 20 ); @MainName = (SELECT MainName FROM __input); INSERT INTO __output SELECT m.Value * s.Value FROM MainTable m INNER JOIN SubTable s ON m.Id = s.MainId WHERE m.Name = @MainName; END; CREATE PROCEDURE DeleteItems ( MainName VARCHAR ( 20 ) ) BEGIN DECLARE @MainName VARCHAR ( 20 ); DECLARE @MainId INTEGER; @MainName = (SELECT MainName FROM __input); @MainId = (SELECT TOP 1 Id FROM MainTable WHERE Name = @MainName); DELETE FROM SubTable WHERE MainId = @MainId; DELETE FROM MainTable WHERE Id = @MainId; END; Actually, the problem I had - even so light stored procedures work very-very slow (about 50-150 ms) relatively to plain queries (0-5ms). 
To test the performance, I created a simple test (in F# using ADS ADO.NET provider): open System; open System.Data; open System.Diagnostics; open Advantage.Data.Provider; let mainName = "main name #"; let subName = "sub name #"; // INSERT let cmdTextScriptInsert = " DECLARE @MainId INTEGER; DECLARE @SubId INTEGER; @MainId = (SELECT MAX(Id)+1 FROM MainTable); @SubId = (SELECT MAX(Id)+1 FROM SubTable ); INSERT INTO MainTable (Id, Name, Value) VALUES (@MainId, :MainName, :MainValue); INSERT INTO SubTable (Id, Name, MainId, Value) VALUES (@SubId, :SubName, @MainId, :SubValue); SELECT @MainId, @SubId FROM system.iota;"; let cmdTextProcedureInsert = "CreateItems"; // UPDATE let cmdTextScriptUpdate = " DECLARE @MainId INTEGER; @MainId = (SELECT TOP 1 Id FROM MainTable WHERE Name = :MainName); UPDATE MainTable SET Value = :MainValue WHERE Id = @MainId; UPDATE SubTable SET Value = :SubValue WHERE MainId = @MainId;"; let cmdTextProcedureUpdate = "UpdateItems"; // SELECT let cmdTextScriptSelect = " SELECT m.Value * s.Value FROM MainTable m INNER JOIN SubTable s ON m.Id = s.MainId WHERE m.Name = :MainName;"; let cmdTextProcedureSelect = "SelectItems"; // DELETE let cmdTextScriptDelete = " DECLARE @MainId INTEGER; @MainId = (SELECT TOP 1 Id FROM MainTable WHERE Name = :MainName); DELETE FROM SubTable WHERE MainId = @MainId; DELETE FROM MainTable WHERE Id = @MainId;"; let cmdTextProcedureDelete = "DeleteItems"; let cnnStr = @"data source=D:\DB\test.add; ServerType=local; user id=adssys; password=***;"; let cnn = new AdsConnection(cnnStr); try cnn.Open(); let cmd = cnn.CreateCommand(); let parametrize ix prms = cmd.Parameters.Clear(); let addParam = function | "MainName" -> cmd.Parameters.Add(":MainName" , mainName + ix.ToString()) |> ignore; | "SubName" -> cmd.Parameters.Add(":SubName" , subName + ix.ToString() ) |> ignore; | "MainValue" -> cmd.Parameters.Add(":MainValue", ix * 3 ) |> ignore; | "SubValue" -> cmd.Parameters.Add(":SubValue" , ix * 7 ) |> ignore; | _ -> () prms |> List.iter addParam; let runTest testData = let (cmdType, cmdName, cmdText, cmdParams) = testData; let toPrefix cmdType cmdName = let prefix = match cmdType with | CommandType.StoredProcedure -> "Procedure-" | CommandType.Text -> "Script -" | _ -> "Unknown -" in prefix + cmdName; let stopWatch = new Stopwatch(); let runStep ix prms = parametrize ix prms; stopWatch.Start(); cmd.ExecuteNonQuery() |> ignore; stopWatch.Stop(); cmd.CommandText <- cmdText; cmd.CommandType <- cmdType; let startId = 1500; let count = 10; for id in startId .. 
startId+count do runStep id cmdParams; let elapsed = stopWatch.Elapsed; Console.WriteLine("Test '{0}' - total: {1}; per call: {2}ms", toPrefix cmdType cmdName, elapsed, Convert.ToInt32(elapsed.TotalMilliseconds)/count); let lst = [ (CommandType.Text, "Insert", cmdTextScriptInsert, ["MainName"; "SubName"; "MainValue"; "SubValue"]); (CommandType.Text, "Update", cmdTextScriptUpdate, ["MainName"; "MainValue"; "SubValue"]); (CommandType.Text, "Select", cmdTextScriptSelect, ["MainName"]); (CommandType.Text, "Delete", cmdTextScriptDelete, ["MainName"]) (CommandType.StoredProcedure, "Insert", cmdTextProcedureInsert, ["MainName"; "SubName"; "MainValue"; "SubValue"]); (CommandType.StoredProcedure, "Update", cmdTextProcedureUpdate, ["MainName"; "MainValue"; "SubValue"]); (CommandType.StoredProcedure, "Select", cmdTextProcedureSelect, ["MainName"]); (CommandType.StoredProcedure, "Delete", cmdTextProcedureDelete, ["MainName"])]; lst |> List.iter runTest; finally cnn.Close(); And I'm getting the following results: Test 'Script -Insert' - total: 00:00:00.0292841; per call: 2ms Test 'Script -Update' - total: 00:00:00.0056296; per call: 0ms Test 'Script -Select' - total: 00:00:00.0051738; per call: 0ms Test 'Script -Delete' - total: 00:00:00.0059258; per call: 0ms Test 'Procedure-Insert' - total: 00:00:01.2567146; per call: 125ms Test 'Procedure-Update' - total: 00:00:00.7442440; per call: 74ms Test 'Procedure-Select' - total: 00:00:00.5120446; per call: 51ms Test 'Procedure-Delete' - total: 00:00:01.0619165; per call: 106ms The situation with the remote server is much better, but still a great gap between plaqin queries and stored procedures: Test 'Script -Insert' - total: 00:00:00.0709299; per call: 7ms Test 'Script -Update' - total: 00:00:00.0161777; per call: 1ms Test 'Script -Select' - total: 00:00:00.0258113; per call: 2ms Test 'Script -Delete' - total: 00:00:00.0166242; per call: 1ms Test 'Procedure-Insert' - total: 00:00:00.5116138; per call: 51ms Test 'Procedure-Update' - total: 00:00:00.3802251; per call: 38ms Test 'Procedure-Select' - total: 00:00:00.1241245; per call: 12ms Test 'Procedure-Delete' - total: 00:00:00.4336334; per call: 43ms Is it any chance to improve the SP performance? Please advice. ADO.NET driver version - 9.10.2.9 Server version - 9.10.0.9 (ANSI - GERMAN, OEM - GERMAN) Thanks!

    Read the article

  • How to count the most recent value based on multiple criteria?

    - by Andrew
    I keep a log of phone calls like the following, where the F column is LVM = Left Voice Mail, U = Unsuccessful, S = Successful.
    A1 1  B1 Smith   C1 John  D1 11/21/2012  E1 8:00 AM  F1 LVM
    A2 2  B2 Smith   C2 John  D2 11/22/2012  E2 8:15 AM  F2 U
    A3 3  B3 Harvey  C3 Luke  D3 11/22/2012  E3 8:30 AM  F3 S
    A4 4  B4 Smith   C4 John  D4 11/22/2012  E4 9:00 AM  F4 S
    A5 5  B5 Smith   C5 John  D5 11/23/2012  E5 8:00 AM  F5 LVM
    This is a small sample; I actually have over 700 entries. In my line of work, it is important to know how many unsuccessful (LVM or U) calls I have made since the last successful one (S). Since values in the F column can repeat, I need to take both the B and C columns into consideration. Also, since I can make a successful call with a client and then be trying to contact them again, I need to be able to count from the last successful call. My G column is completely open, which is where I would like to put a running total for each client (ideally G5 would = 1 while G4 = 0, G3 = 0, G2 = 2, G1 = 1, but I want these values calculated automatically so that I do not have to scroll through 700 names).

    Read the article

  • How to record tv to network share with Windows Media Center?

    - by Peterdk
    Well, you would think that Windows 7's new Media Center would be up to the task of recording your TV to a network share/drive. Too bad, it looks like it's just not possible. I have a Windows 2008 R2 server, and a Windows 7 machine with a TV card. Since my server has 2TB of storage, it would be nice to record directly to its networked drive (I mounted it as Z:). I tried the following: Selecting it in Media Center itself: not working, not available. Editing the registry: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Media Center\Service\Recording, setting RecordPath to Z:\TV. Not working. Editing the registry: setting RecordPath to \\server\TV. Not working. Creating a symlink (mklink /D) to Z:\TV and \\server\TV and setting that in the registry as RecordPath. Currently I am out of options. I could of course install Windows 7 on my server, but I have no license for that, and my Windows 2008 R2 is free from DreamSpark. Are there people who are successfully recording to a networked drive/storage? Edit: I also need to mention that I need to be able to access the stored files from other PCs, like my laptop. So iSCSI is great for recording, but it looks like you can't access iSCSI devices from multiple PCs. Sharing an iSCSI device looks to be out of the question, so: are there workarounds to get this thing recording to my network drive?

    Read the article

  • Currency Conversion in Oracle BI applications

    - by Saurabh Verma
    Authored by Vijay Aggarwal and Hichem Sellami A typical data warehouse contains Star and/or Snowflake schema, made up of Dimensions and Facts. The facts store various numerical information including amounts. Example; Order Amount, Invoice Amount etc. With the true global nature of business now-a-days, the end-users want to view the reports in their own currency or in global/common currency as defined by their business. This presents a unique opportunity in BI to provide the amounts in converted rates either by pre-storing or by doing on-the-fly conversions while displaying the reports to the users. Source Systems OBIA caters to various source systems like EBS, PSFT, Sebl, JDE, Fusion etc. Each source has its own unique and intricate ways of defining and storing currency data, doing currency conversions and presenting to the OLTP users. For example; EBS stores conversion rates between currencies which can be classified by conversion rates, like Corporate rate, Spot rate, Period rate etc. Siebel stores exchange rates by conversion rates like Daily. EBS/Fusion stores the conversion rates for each day, where as PSFT/Siebel store for a range of days. PSFT has Rate Multiplication Factor and Rate Division Factor and we need to calculate the Rate based on them, where as other Source systems store the Currency Exchange Rate directly. OBIA Design The data consolidation from various disparate source systems, poses the challenge to conform various currencies, rate types, exchange rates etc., and designing the best way to present the amounts to the users without affecting the performance. When consolidating the data for reporting in OBIA, we have designed the mechanisms in the Common Dimension, to allow users to report based on their required currencies. OBIA Facts store amounts in various currencies: Document Currency: This is the currency of the actual transaction. For a multinational company, this can be in various currencies. Local Currency: This is the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company. Global Currencies: OBIA provides five Global Currencies. Three are used across all modules. The last two are for CRM only. A Global currency is very useful when creating reports where the data is viewed enterprise-wide. Example; a US based multinational would want to see the reports in USD. The company will choose USD as one of the global currencies. OBIA allows users to define up-to five global currencies during the initial implementation. The term Currency Preference is used to designate the set of values: Document Currency, Local Currency, Global Currency 1, Global Currency 2, Global Currency 3; which are shared among all modules. There are four more currency preferences, specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5 which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics. When choosing Local Currency for Currency preference, the data will show in the currency of the Ledger (or Business Unit) in the prompt. So it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section: Toggling Currency Preferences in the Dashboard. Design Logic When extracting the fact data, the OOTB mappings extract and load the document amount, and the local amount in target tables. It also loads the exchange rates required to convert the document amount into the corresponding global amounts. 
If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the Local currency code, and the Local exchange rate. The Load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the Global Currencies and looks up the corresponding exchange rates. The lookup of exchange rates is done via the Exchange Rate Dimension provided as a Common/Conforming Dimension in OBIA. The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and Rate Type. Two physical tables W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G are used to provide the lookups and conversions between currencies. The data is loaded from the source system’s Ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range. On the other hand, W_GLOBAL_EXCH_RATE_G stores the currency conversions between the document currency and the pre-defined five Global Currencies for each day. Based on the requirements, the fact mappings can decide and use one or both tables to do the conversion. Currency design in OBIA also taps into the MLS and Domain architecture, thus allowing the users to map the currencies to a universal Domain during the implementation time. This is especially important for companies deploying and using OBIA with multiple source adapters. Some Gotchas to Look for It is necessary to think through the currencies during the initial implementation. 1) Identify various types of currencies that are used by your business. Understand what will be your Local (or Base) and Documentation currency. Identify various global currencies that your users will want to look at the reports. This will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, but may cause Full data loads and hence lost time. 2) If the user has a multi source system make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager do have the corresponding source specific counterparts. In other words, make sure for every DW specific value chosen for Currency Code or Rate Type, there is a source Domain mapping already done. Technical Section This section will briefly mention the technical scenarios employed in the OBIA adaptors to extract data from each source system. In OBIA, we have two main tables which store the Currency Rate information as explained in previous sections. W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G are the two tables. W_EXCH_RATE_G stores all the Currency Conversions present in the source system. It captures data for a Date Range. W_GLOBAL_EXCH_RATE_G has Global Currency Conversions stored at a Daily level. However the challenge here is to store all the 5 Global Currency Exchange Rates in a single record for each From Currency. Let’s voyage further into the Source System Extraction logic for each of these tables and understand the flow briefly. EBS: In EBS, we have Currency Data stored in GL_DAILY_RATES table. As the name indicates GL_DAILY_RATES EBS table has data at a daily level. However in our warehouse we store the data with a Date Range and insert a new range record only when the Exchange Rate changes for a particular From Currency, To Currency and Rate Type. Below are the main logical steps that we employ in this process. (Incremental Flow only) – Cleanup the data in W_EXCH_RATE_G. Delete the records which have Start Date > minimum conversion date Update the End Date of the existing records. 
Compress the daily data from GL_DAILY_RATES table into Range Records. Incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter. Generate Previous Rate, Previous Date and Next Date for each of the Daily record from the OLTP. Filter out the records which have Conversion Rate same as Previous Rates or if the Conversion Date lies within a single day range. Mark the records as ‘Keep’ and ‘Filter’ and also get the final End Date for the single Range record (Unique Combination of From Date, To Date, Rate and Conversion Date). Filter the records marked as ‘Filter’ in the INFA map. The above steps will load W_EXCH_RATE_GS. Step 0 updates/deletes W_EXCH_RATE_G directly. SIL map will then insert/update the GS data into W_EXCH_RATE_G. These steps convert the daily records in GL_DAILY_RATES to Range records in W_EXCH_RATE_G. We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G. This is a table where we store data at a Daily Granular Level. However we need to pivot the data because the data present in multiple rows in source tables needs to be stored in different columns of the same row in DW. We use GROUP BY and CASE logic to achieve this. Fusion: Fusion has extraction logic very similar to EBS. The only difference is that the Cleanup logic that was mentioned in step 0 above does not use $$XRATE_UPD_NUM_DAY parameter. In Fusion we bring all the Exchange Rates in Incremental as well and do the cleanup. The SIL then takes care of Insert/Updates accordingly. PeopleSoft:PeopleSoft does not have From Date and To Date explicitly in the Source tables. Let’s look at an example. Please note that this is achieved from PS1 onwards only. 1 Jan 2010 – USD to INR – 45 31 Jan 2010 – USD to INR – 46 PSFT stores records in above fashion. This means that Exchange Rate of 45 for USD to INR is applicable for 1 Jan 2010 to 30 Jan 2010. We need to store data in this fashion in DW. Also PSFT has Exchange Rate stored as RATE_MULT and RATE_DIV. We need to do a RATE_MULT/RATE_DIV to get the correct Exchange Rate. We generate From Date and To Date while extracting data from source and this has certain assumptions: If a record gets updated/inserted in the source, it will be extracted in incremental. Also if this updated/inserted record is between other dates, then we also extract the preceding and succeeding records (based on dates) of this record. This is required because we need to generate a range record and we have 3 records whose ranges have changed. Taking the same example as above, if there is a new record which gets inserted on 15 Jan 2010; the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan and 31 Jan to Next available date. Even though 1 Jan record and 31 Jan have not changed, we will still extract them because the range is affected. Similar logic is used for Global Exchange Rate Extraction. We create the Range records and get it into a Temporary table. Then we join to Day Dimension, create individual records and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date and Rate Type. Siebel: Siebel Facts are dependent on Global Exchange Rates heavily and almost none of them really use individual Exchange Rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used in Siebel from PS1 release onwards. As of January 2002, the Euro Triangulation method for converting between currencies belonging to EMU members is not needed for present and future currency exchanges. 
However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following description applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries. If a country is a member of the European Monetary Union (EMU), you should convert its currency to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked. Due to this, there are multiple extraction flows in SEBL ie. EUR to EMU, EUR to NonEMU, EUR to DMC and so on. We load W_EXCH_RATE_G through multiple flows with these data. This has been kept same as previous versions of OBIA. W_GLOBAL_EXCH_RATE_G being a new table does not have such needs. However SEBL does not have From Date and To Date columns in the Source tables similar to PSFT. We use similar extraction logic as explained in PSFT section for SEBL as well. What if all 5 Global Currencies configured are same? As mentioned in previous sections, from PS1 onwards we store Global Exchange Rates in W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves Pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns. As mentioned in previous sections, we use CASE and GROUP BY functions to achieve this. This approach poses a unique problem when all the 5 Global Currencies Chosen are same. For example – If the user configures all 5 Global Currencies as ‘USD’ then the extract logic will not be able to generate a record for From Currency=USD. This is because, not all Source Systems will have a USD->USD conversion record. We have _Generated mappings to take care of this case. We generate a record with Conversion Rate=1 for such cases. Reusable Lookups Before PS1, we had a Mapplet for Currency Conversions. In PS1, we only have reusable Lookups- LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups have another layer of logic so that all the lookup conditions are met when they are used in various Fact Mappings. Any user who would want to do a LKP on W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G should and must use these Lookups. A direct join or Lookup on the tables might lead to wrong data being returned. Changing Currency preferences in the Dashboard: In the 796x series, all amount metrics in OBIA were showing the Global1 amount. The customer needed to change the metric definitions to show them in another Currency preference. Project Analytics started supporting currency preferences since 7.9.6 release though, and it published a Tech note for other module customers to add toggling between currency preferences to the solution. List of Currency Preferences Starting from 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. The new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the xml file: userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder, under :< home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml This file contains the list of currency preferences, like“Local Currency”, “Global Currency 1”,…which customers can also rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic. 
In Static mode, all users will see the full list as in the user preference currencies file. In the Dynamic mode, the list shown in the currency prompt drop down is a result of a dynamic query specified in the same file. Customers can build some security into the rpd, so the list of currency preferences will be based on the user roles…BI Applications built a subject area: “Dynamic Currency Preference” to run this query, and give every user only the list of currency preferences required by his application roles. Adding Currency to an Amount Field When the user selects one of the items from the currency prompt, all the amounts in that page will show in the Currency corresponding to that preference. For example, if the user selects “Global Currency1” from the prompt, all data will be showing in Global Currency 1 as specified in the Configuration Manager. If the user select “Local Currency”, all amount fields will show in the Currency of the Business Unit selected in the BU filter of the same page. If there is no particular Business Unit selected in that filter, and the data selected by the query contains amounts in more than one currency (for example one BU has USD as a functional currency, the other has EUR as functional currency), then subtotals will not be available (cannot add USD and EUR amounts in one field), and depending on the set up (see next paragraph), the user may receive an error. There are two ways to add the Currency field to an amount metric: In the form of currency code, like USD, EUR…For this the user needs to add the field “Apps Common Currency Code” to the report. This field is in every subject area, usually under the table “Currency Tag” or “Currency Code”… In the form of currency symbol ($ for USD, € for EUR,…) For this, the user needs to format the amount metrics in the report as a currency column, by specifying the currency tag column in the Column Properties option in Column Actions drop down list. Typically this column should be the “BI Common Currency Code” available in every subject area. Select Column Properties option in the Edit list of a metric. In the Data Format tab, select Custom as Treat Number As. Enter the following syntax under Custom Number Format: [$:currencyTagColumn=Subjectarea.table.column] Where Column is the “BI Common Currency Code” defined to take the currency code value based on the currency preference chosen by the user in the Currency preference prompt.
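    The GROUP BY and CASE pivot mentioned in the technical section above, which collapses one row per target currency into a single row carrying all five global rates, can be sketched roughly as follows. This is an illustration only, not the actual OBIA warehouse logic or DDL; the table, the columns, and the five currencies (USD, EUR, GBP, JPY, INR) are assumed for the example:

        -- Illustrative pivot: one output row per from-currency, day and rate type,
        -- with the five configured global-currency rates spread across columns
        SELECT from_currency,
               conversion_date,
               rate_type,
               MAX(CASE WHEN to_currency = 'USD' THEN exchange_rate END) AS global1_exch_rate,
               MAX(CASE WHEN to_currency = 'EUR' THEN exchange_rate END) AS global2_exch_rate,
               MAX(CASE WHEN to_currency = 'GBP' THEN exchange_rate END) AS global3_exch_rate,
               MAX(CASE WHEN to_currency = 'JPY' THEN exchange_rate END) AS global4_exch_rate,
               MAX(CASE WHEN to_currency = 'INR' THEN exchange_rate END) AS global5_exch_rate
        FROM   daily_exchange_rates
        GROUP BY from_currency, conversion_date, rate_type;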

    Read the article

  • How to secure Add child record functionality in MVC on Parent's view?

    - by RSolberg
    I'm trying to avoid some potential security issues as I expose a new set of functionality to the real world. This is basically functionality that will allow a new comment to be added via a partial view on the "Parent" page. My comment needs to know a couple of things: first, what record the comment is for, and secondly, who is making the comment. I really don't like using a hidden field to store the ID of the parent record in the add-comment form, as that can be easily changed with some DOM modifications. How should I handle this?
    PARENT
    <% Html.RenderPartial("AddComment", Model.Comments); %>
    CHILD
    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<CommentsViewModel>" %> <% using (Html.BeginForm("AddComment", "Requests")) {%> <fieldset> <legend>New Comment</legend> <%= Html.HiddenFor(p => p.RequestID) %> <%= Html.TextBoxFor(p => p.Text) %> &nbsp; <input type="submit" value="Add" /> </fieldset> <% } %>
    CONTROLLER
    [AcceptVerbs(HttpVerbs.Post)] public void AddComment(CommentsViewModel commentsModel) { var user = GetCurrentUser(); commentsModel.CreatedByID = user.UserID; RequestsService.AddComment(commentsModel); }

    Read the article

  • How to send a data record using SendMessage(..) to a separate process

    - by XBasic3000
    I am using this to send a data record between two separate processes, but it fails; it works only within the same process. This is the concept:
    //-----------------------------------------------------------------------------------
    MainApps
    //-----------------------------------------------------------------------------------
    Type PMyrec = ^TMyrec; TMyrec = Record name : string; add : string; age : integer; end;
    :OnButtonSend var aData : PMyrec; begin new(aData); aData.Name := 'MyName'; aData.Add := 'My Address'; aData.Age : 18; SendMessage(FindWindow('SubApps'),WM_MyMessage,0,Integer(@aData)); end;
    //-----------------------------------------------------------------------------------
    SubApps
    //-----------------------------------------------------------------------------------
    Type PMyrec = ^TMyrec; TMyrec = Record name : string; add : string; age : integer; end;
    :OnCaptureMessage var aData : PMyrec; begin aData := PMyrec(Msg.LParam); showmessage(aData^.Name); end;

    Read the article

  • In Wireshark's Protocol Hierarchy Statistics screen, is the total byte count of a capture the sum of the Bytes column or just the top line (Frame)?

    - by Howiecamp
    Part 1 - I'm looking at Wireshark's Protocol Hierarchy Statistics screen (sample below); is the total byte count of the capture the sum of the Bytes column or just the top line (Frame)? I'm 99% sure that it's the latter because of protocol rollup, but I wanted to confirm. Part 2 - From the Wireshark documentation on this screen: "Protocol layers can consist of packets that won't contain any higher layer protocol, so the sum of all higher layer packets may not sum up to the protocols packet count. Example: In the screenshot TCP has 85,83% but the sum of the subprotocols (HTTP, ...) is much less. This may be caused by TCP protocol overhead, e.g. TCP ACK packets won't be counted as packets of the higher layer)." Can you explain this?

    Read the article

  • HD Tune warning for "Reallocated Event Count" with a new/unused drive. How serious is that?

    - by Developer Art
    I've just looked at the health status of my old 2.5 inch 500 GB Fujitsu drive with the popular "HD Tune" utility. It shows a warning for the "Reallocated Event Count" property. How serious is that? The thing is that the drive is practically new. I pulled it out of a new laptop over a year ago and have never used it since. Right now it only has 53 "Power On" hours, which sounds about right since I only had it running a few evenings overnight before switching it for something more performant. Does this warning indicate that the drive is likely to fail some time in the future? I'm somewhat perplexed since the drive is effectively unused. What is more, I have arranged with somebody to buy this drive off me since I don't really need it. It is 12.5 mm thick (with 3 platters), meaning it doesn't fit into an external enclosure, which makes it quite useless to me. Can I give away the drive without having it on my conscience, or should I rather cancel the deal? In other words, can the drive be used safely for years to come, or should it be thrown away? I'm running a sector test now to see if there are any real problems. Will post the results as soon as they're available.

    Read the article

  • Warning for "Reallocated Event Count" S.M.A.R.T. attribute with a new/unused drive. How serious is that?

    - by Developer Art
    I've just looked at the health status of my old 2.5 inch 500 GB Fujitsu drive with the popular "HD Tune" utility. It shows a warning for the "Reallocated Event Count" property. How serious is that? The thing is that the drive is practically new. I pulled it out of a new laptop over a year ago and have never used it since. Right now it only has 53 "Power On" hours, which sounds about right since I only had it running a few evenings overnight before switching it for something more performant. Does this warning indicate that the drive is likely to fail some time in the future? I'm somewhat perplexed since the drive is effectively unused. What is more, I have arranged with somebody to buy this drive off me since I don't really need it. It is 12.5 mm thick (with 3 platters), meaning it doesn't fit into an external enclosure, which makes it quite useless to me. Can I give away the drive without having it on my conscience, or should I rather cancel the deal? In other words, can the drive be used safely for years to come, or should it be thrown away? I'm running a sector test now to see if there are any real problems. Will post the results as soon as they're available.

    Read the article

  • Read ini from windows batch file

    - by Hintswen
    I'm trying to read an ini file with this format:
    [SectionName]
    total=4
    [AnotherSectionName]
    total=7
    [OtherSectionName]
    total=12
    Basically I want to echo out certain values from the ini file (e.g. the total under OtherSectionName followed by the total from AnotherSectionName).

    Read the article

  • Get 2 records from PLSQL Cursor

    - by Hany
    Hi guys, my question is short. I create a cursor to get some values from my table. I want to get the current record of the cursor (the fetched record) and the next record from the cursor without fetching it, because I want to make a calculation over the current record and the next record. In traditional programming it's a simple operation; you can do it with a for index by increasing it by 1. Do you have any suggestions? :) Thanks.

    Read the article

  • How to do GEDCOM import with minimal database roundtrips. What is best practice for this kind of dev?

    - by Radhi
    In My current application, I need to import users from gedcom file. these users may exist in my registered users or i need to create one registered user for the same. now gedcom file contain s many information e.g. PersonalDetails,Addresses, Education Details, ProfessionalDetails this is one sample of xml file we are storing to store user's profile. <UserProfile xmlns=""> <BasicInfo> <Title value="Basic Details" /> <Fields> <UserId title="UserId" right="Public" value="151" /> <EmailAddress title="Email Address" right="CUG" value="[email protected]" /> <FirstName title="First Name" right="Public" value="Anju" /> <LastName title="Last Name" right="Public" value="Trivedi" /> <DisplayName title="Display Name" right="Private" value="Anju" /> <RegistrationStatusId title="RegistrationStatusId" right="Public" value="10" /> <RegistrationStatus title="Registration Status" right="Private" value="Registered" /> <CityId title="CityId" right="Private" value="19" /> <CityName title="City" right="Public" value="Delhi" /> <StateId title="StateId" right="Private" value="69" /> <StateName title="State" right="Public" value="Delhi" /> <CountryId title="CountryId" right="Private" value="109" /> <CountryName title="Country" right="Public" value="India" /> <Gender title="Gender" right="Private" value="Male" /> <CreatedBy title="CreatedBy" right="Public" value="0" /> <CreatedOn title="CreatedOn" right="Public" value="Nov 27 2009 3:08PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="13" /> <ModifiedOn title="ModifiedOn" right="Public" value="Mar 3 2010 6:56PM " /> <LogInStatusId title="LogInStatusId" right="Public" value="1" /> <LogInStatus title="LogIn Status" right="Private" value="Free" /> <ProfileImagePath title="Profile Pic" right="Public" value="~/Images/13_HolidayBarbie07CL2010427143129.jpg" /> <ProfileThumbnailPath title="Profile Thumbnail" right="Public" value="~/Images/Thumb13_HolidayBarbie07CL2010427143129.jpg" /> </Fields> </BasicInfo> <PersonalInfo> <Title value="Personal Details" /> <Fields> <Nickname title="Nick Name" right="Public" value="Anju" /> <NativeLocation title="Native" right="Public" value="Mehsana" /> <DateofAnniversary title="Anniversary Dt." 
right="Private" value="4/1/2010" /> <BloodGroupId title="BloodGroupId" right="Public" value="24" /> <BloodGroupName title="Blood Group" right="Public" value="A+" /> <MaritalStatusId title="MaritalStatusId" right="Private" value="35" /> <MaritalStatusName title="Marital status" right="Private" value="UnMarried" /> <DateofDeath title="Death dt" right="Private" value="" /> <CreatedBy title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="13" /> <ModifiedOn title="ModifiedOn" right="Public" value="4/27/2010 2:32:07 PM" /> <DateOfBirth title="Birth Date" value="" right="CUG" /> <BirthPlace title="Birth Place" value="Jaipur" right="Private" /> </Fields> </PersonalInfo> <FamilyInfo> <Title value="Family Details" /> <Fields> <GallantryHistory title="Gallantry History" right="Public" value="Anjli History" /> <Ethinicity title="Ethinicity" right="Public" value="Indian" /> <KulDev title="KulDev" right="Public" value="Krishna" /> <KulDevi title="KulDevi" right="Public" value="Lakhsmi" /> <Caste title="Caste" right="Private" value="Vaishnav" /> <SunSignId title="SunSignId" right="Public" value="15" /> <SunSignName title="SunSignName" right="Public" value="Gemini" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Dec 11 2009 12:00AM " /> <ModifiedBy title="ModifiedBy" right="Public" value="13" /> <ModifiedOn title="ModifiedOn" right="Public" value="Dec 11 2009 12:00AM " /> </Fields> </FamilyInfo> <HobbyInfo> <Title value="Hobbies/Interests" /> <Fields> <AbountMe title="Abount Me" right="Public" value="" /> <Hobbies title="Hobbies" right="Public" value="" /> <Food title="Food" right="Public" value="" /> <Movies title="Movies" right="Public" value="" /> <Music title="Music" right="Public" value="" /> <TVShows title="TV Shows" right="Public" value="" /> <Books title="Books" right="Public" value="" /> <Sports title="Sports" right="Public" value="" /> <Will title="Will" right="Public" value="" /> <FavouriteQuotes title="Favourite Quotes" right="Public" value="" /> <CremationPrefernces title="Cremation Prefernces" right="Public" value="" /> <CreatedBy title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="" /> <ModifiedOn title="ModifiedOn" right="Public" value="" /> </Fields> </HobbyInfo> <PermenantAddr> <Title value="Permenant Address" /> <Fields> <Address title="Address" right="Public" value="Select" /> <CityId title="CityId" right="Public" value="116" /> <CityName title="City" right="Public" value="Iran" /> <StateId title="StateId" right="Public" value="95" /> <StateName title="State" right="Public" value="Iran" /> <CountryId title="CountryId" right="Public" value="7" /> <CountryName title="Country" right="Public" value="Afghanistan" /> <ZipCode title="ZipCode" right="Private" value="" /> <CreatedBy title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="13" /> <ModifiedOn title="ModifiedOn" right="Public" value="4/6/2010 10:48:39 AM" /> </Fields> </PermenantAddr> <PresentAddr> <Title value="Present Address" /> <Fields> <Address title="Address" right="Public" value="Select" /> <CityId title="CityId" right="Public" value="1" /> <CityName title="City" right="Public" value="Select" /> <StateId title="StateId" right="Public" value="1" /> <StateName 
title="State" right="Public" value="Select" /> <CountryId title="CountryId" right="Public" value="1" /> <CountryName title="Country" right="Public" value="Select" /> <ZipCode title="ZipCode" right="Private" value="" /> <CreatedBy title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="" /> <ModifiedOn title="ModifiedOn" right="Public" value="" /> </Fields> </PresentAddr> <ContactInfo> <Title value="Contact Details" /> <Fields> <DayPhoneNo title="Day Phone" right="Public" value="" /> <NightPhoneNo title="Night Phone" right="Public" value="" /> <MobileNo title="Mobile No" right="Private" value="" /> <FaxNo title="Fax No" right="CUG" value="" /> <CreatedBy title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="" /> <ModifiedOn title="ModifiedOn" right="Public" value="" /> </Fields> </ContactInfo> <EmailInfo> <Title value="Alternate Email Addresses" /> <Fields> <Record right="Public"> <Id title="Id" right="Public" value="3" /> <Provider title="Provider" right="Public" value="google" /> <EmailAddress title="Email Address" right="Public" value="[email protected]" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Mar 3 2010 10:17AM " /> <ModifiedBy title="ModifiedBy" right="Public" value="0" /> <ModifiedOn title="ModifiedOn" right="Public" value=" " /> </Record> <Record right="Public"> <Id title="Id" right="Public" value="4" /> <Provider title="Provider" right="Public" value="Yahoo" /> <EmailAddress title="Email Address" right="Public" value="[email protected]" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Mar 3 2010 6:53PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="0" /> <ModifiedOn title="ModifiedOn" right="Public" value=" " /> </Record> <Record right="Private"> <Provider value="111" right="Private" /> <EmailAddress value="[email protected]" right="Private" /> <Id value="5" /> <IsActive value="true" right="Private" /> <ModifiedBy value="13" right="Private" /> <ModifiedOn value="Tuesday, March 16, 2010" right="Private" /> </Record> </Fields> </EmailInfo> <AcademicInfo> <Title value="Education Details" /> <Fields> <Record right="Public"> <Id title="Id" right="Public" value="0" /> <Education title="Education" right="Public" value="" /> <Institute title="Institute" right="Public" value="" /> <PassingYear title="Passing Year" right="Public" value="" /> <IsActive title="IsActive" right="Public" value="" /> <CreatedBy title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="" /> <ModifiedOn title="ModifiedOn" right="Public" value="" /> </Record> </Fields> </AcademicInfo> <AchievementInfo> <Title value="Achievement Details" /> <Fields> <Record right="Public"> <Id title="Id" right="Public" value="0" /> <Awards title="Award" right="Public" value="" /> <FieldOfAward title="Field Of Award" right="Public" value="" /> <Tournament title="Tournament" right="Public" value="" /> <AwardDescription title="Description" right="Public" value="" /> <AwardYear title="Award Year" right="Public" value="" /> <IsActive title="IsActive" right="Public" value="" /> <CreatedBy 
title="CreatedBy" right="Public" value="" /> <CreatedOn title="CreatedOn" right="Public" value="" /> <ModifiedBy title="ModifiedBy" right="Public" value="" /> <ModifiedOn title="ModifiedOn" right="Public" value="" /> </Record> </Fields> </AchievementInfo> <ProfessionalInfo> <Title value="Professional Details" /> <Fields> <Record right="Public"> <Id title="Id" right="Public" value="4" /> <Occupation title="Occupation" right="Public" value="a" /> <Organization title="Organization" right="Public" value="a" /> <ProjectsDescription title="Description" right="Public" value="a" /> <Duration title="Duration" right="Public" value="2" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Jan 7 2010 1:14PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="13" /> <ModifiedOn title="ModifiedOn" right="Public" value="Jan 7 2010 1:14PM " /> </Record> <Record right="Public"> <Id title="Id" right="Public" value="5" /> <Occupation title="Occupation" right="Public" value="ab" /> <Organization title="Organization" right="Public" value="zsd" /> <ProjectsDescription title="Description" right="Public" value="sd" /> <Duration title="Duration" right="Public" value="5" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Jan 7 2010 1:15PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="13" /> <ModifiedOn title="ModifiedOn" right="Public" value="Jan 7 2010 1:15PM " /> </Record> <Record right="Public"> <Id title="Id" right="Public" value="8" /> <Occupation title="Occupation" right="Public" value="fgdf" /> <Organization title="Organization" right="Public" value="gdfg" /> <ProjectsDescription title="Description" right="Public" value="dfgdf" /> <Duration title="Duration" right="Public" value="12" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Feb 22 2010 5:07PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="0" /> <ModifiedOn title="ModifiedOn" right="Public" value="Jan 1 1900 12:00AM " /> </Record> <Record right="Public"> <Id title="Id" right="Public" value="9" /> <Occupation title="Occupation" right="Public" value="fgdf" /> <Organization title="Organization" right="Public" value="gdfg" /> <ProjectsDescription title="Description" right="Public" value="dfgdf" /> <Duration title="Duration" right="Public" value="12" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Feb 22 2010 5:11PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="0" /> <ModifiedOn title="ModifiedOn" right="Public" value="Jan 1 1900 12:00AM " /> </Record> <Record right="Public"> <Id title="Id" right="Public" value="10" /> <Occupation title="Occupation" right="Public" value="fgdf" /> <Organization title="Organization" right="Public" value="gdfg" /> <ProjectsDescription title="Description" right="Public" value="dfgdf" /> <Duration title="Duration" right="Public" value="12" /> <IsActive title="IsActive" right="Public" value="false" /> <CreatedBy title="CreatedBy" right="Public" value="13" /> <CreatedOn title="CreatedOn" right="Public" value="Feb 22 2010 5:13PM " /> <ModifiedBy title="ModifiedBy" right="Public" value="0" /> <ModifiedOn 
title="ModifiedOn" right="Public" value="Jan 1 1900 12:00AM " /> </Record> </Fields> </ProfessionalInfo> <SecuritySettings> <AlbumRights> <Create value="Private" /> <View value="CUG" /> <Edit value="CUG" /> <Delete value="CUG" /> <PostComments value="CUG" /> <AddToAlbum value="CUG" /> </AlbumRights> <ImageRights> <Create value="Private" /> <View value="CUG" /> <Edit value="CUG" /> <Delete value="CUG" /> <PostComments value="Private" /> </ImageRights> </SecuritySettings> </UserProfile> now when i am importing data from gedcom, i am creating one person object which contains all this info. but before i insert it itodatabase have to check if userid exist for the emailaddress dont update data else create a user and update its profilexml from data fecthed from gedcom. for this i think i need some soln by which i can do only one roundtrip to database and can update all user's xml. or i can execute one sp to get userid from all users where i'll check if user exist then return userid else insert basic data and return inserted userid then for every user make xml from data and update it. please provide be suggestion what is best practice to do this kind of development. if need any more details please write me

    Read the article

  • Get two records from PLSQL cursor

    - by Hany
    My question is short. I create a cursor to get some values from my table. I want to get the current record of the cursor (the fetched record) and the next record from the cursor without fetching it, because I want to make a calculation over the current record and the next record. In traditional programming it's a simple operation; you can do it with a for index by increasing it by 1. Do you have any suggestions?
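    One way to get at both rows without a second fetch, assuming Oracle and its LEAD analytic function, is to have the cursor's query pair each row with the following row; the table and columns below are placeholders:

        DECLARE
          CURSOR c_pairs IS
            SELECT t.id,
                   t.amount                            AS current_amount,
                   LEAD(t.amount) OVER (ORDER BY t.id) AS next_amount
            FROM   my_table t
            ORDER BY t.id;
        BEGIN
          FOR r IN c_pairs LOOP
            -- Each fetched row carries its own value and the next row's value;
            -- next_amount is NULL on the last row.
            dbms_output.put_line(r.current_amount || ' -> ' || NVL(TO_CHAR(r.next_amount), 'none'));
          END LOOP;
        END;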

    Read the article

  • Access crosstab formula field in another crosstab column?

    - by Damien Joe
    How do I access a crosstab formula field in another column? I have a report like the following, with Dues and Total both formula fields:
    Amount   Dues (Done by a Formula)   Total (Done by a Formula)
    ------   ------------------------   -------------------------
    500      20 %                       someAmount
    Formula for Dues: WhileReadingRecords; numberVar due := {Command.SomeField}/100; due
    Formula for Total: WhileReadingRecords; numberVar total := {Command.Amount} - due; total
    How do I access the due field inside the second formula for each row of the record?

    Read the article

  • Linq, how to specify timestamp condition?

    - by 5YrsLaterDBA
    I have a logins table which records all login and logout activities. I want to look at all login activities of a particular user within 24 hours. How do I do it? Something like this: var records = from record in db.Logins where record.Users.UserId == userId && record.Timestamp <= (DateTime.Now + 24) select record; The record.Timestamp <= (DateTime.Now + 24) part is wrong here. I am using C# 3 + L2E.

    Read the article

  • multi-thread access MySQL error

    - by user188916
    I have written a simple multi-threaded C program to access MySQL,it works fine except when i add usleep() or sleep() function in each thread function. i created two pthreads in the main method, int main(){ mysql_library_init(0,NULL,NULL); printf("Hello world!\n"); init_pool(&p,100); pthread_t producer; pthread_t consumer_1; pthread_t consumer_2; pthread_create(&producer,NULL,produce_fun,NULL); pthread_create(&consumer_1,NULL,consume_fun,NULL); pthread_create(&consumer_2,NULL,consume_fun,NULL); mysql_library_end(); } void * produce_fun(void *arg){ pthread_detach(pthread_self()); //procedure while(1){ usleep(500000); printf("producer...\n"); produce(&p,cnt++); } pthread_exit(NULL); } void * consume_fun(void *arg){ pthread_detach(pthread_self()); MYSQL db; MYSQL *ptr_db=mysql_init(&db); mysql_real_connect(); //procedure while(1){ usleep(1000000); printf("consumer..."); int item=consume(&p); addRecord_d(ptr_db,"test",item); } mysql_thread_end(); pthread_exit(NULL); } void addRecord_d(MYSQL *ptr_db,const char *t_name,int item){ char query_buffer[100]; sprintf(query_buffer,"insert into %s values(0,%d)",t_name,item); //pthread_mutex_lock(&db_t_lock); int ret=mysql_query(ptr_db,query_buffer); if(ret){ fprintf(stderr,"%s%s\n","cannot add record to ",t_name); return; } unsigned long long update_id=mysql_insert_id(ptr_db); // pthread_mutex_unlock(&db_t_lock); printf("add record (%llu,%d) ok.",update_id,item); } the program output errors like: [Thread debugging using libthread_db enabled] [New Thread 0xb7ae3b70 (LWP 7712)] Hello world! [New Thread 0xb72d6b70 (LWP 7713)] [New Thread 0xb6ad5b70 (LWP 7714)] [New Thread 0xb62d4b70 (LWP 7715)] [Thread 0xb7ae3b70 (LWP 7712) exited] producer... producer... consumer...consumer...add record (31441,0) ok.add record (31442,1) ok.producer... producer... consumer...consumer...add record (31443,2) ok.add record (31444,3) ok.producer... producer... consumer...consumer...add record (31445,4) ok.add record (31446,5) ok.producer... producer... consumer...consumer...add record (31447,6) ok.add record (31448,7) ok.producer... Error in my_thread_global_end(): 2 threads didn't exit [Thread 0xb72d6b70 (LWP 7713) exited] [Thread 0xb6ad5b70 (LWP 7714) exited] [Thread 0xb62d4b70 (LWP 7715) exited] Program exited normally. and when i add pthread_mutex_lock in function addRecord_d,the error still exists. So what exactly the problem is?

    Read the article

  • undefined reference errors in C++

    - by Phenom
    I have files Record.h and Record.cpp. When I just include the Record.h file, I get several undefined reference errors for functions defined in those files. When I also include Record.cpp, the errors go away. Why is that? Record.h has the forward declarations for the functions that it says are undefined references.

    Read the article

  • MySQL Order By Problem, Why is 1000 being seen as Smaller than 2?

    - by Jack
    I have a strange problem. I am trying to order the output of a set of records by a field called displayOrder. Now even though record A has a displayOrder of 2 and record B has a displayOrder of 1000, record B still shows up before record A. Here's my select statement: SELECT * FROM items ORDER BY displayOrder ASC It works fine until I have a record greater than 9; then 10, 11, 12, etc. are seen as smaller than 2, 3, 4 because they start with the number 1. Any way to fix this?
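    This behaviour usually means displayOrder is stored as a text column, so values compare character by character rather than numerically. A sketch of the usual fixes, assuming MySQL and that the column really is a VARCHAR:

        -- Force a numeric sort without touching the schema
        SELECT * FROM items ORDER BY CAST(displayOrder AS UNSIGNED) ASC;
        -- (ORDER BY displayOrder + 0 ASC has the same effect in MySQL)

        -- Or fix the column type so the original query sorts correctly
        ALTER TABLE items MODIFY displayOrder INT UNSIGNED NOT NULL;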

    Read the article

< Previous Page | 84 85 86 87 88 89 90 91 92 93 94 95  | Next Page >