Search Results

Search found 8893 results on 356 pages for 'stored'.

Page 217/356

  • magic records being deleted

    - by chris
    I have customised osCommerce to pull in a CSV file of products from a supplier on a cron job; during the import, anything without an image, a proper description or a proper title gets removed. The import hasn't run since yesterday, yet a product has disappeared. Anyone who has used osCommerce will know that product information is stored across multiple tables, for example products, products_description and so on. The thing that has got me is that the information is deleted from the products table but not from the products_description table. The product being deleted is a manually entered one which carries a special tag/prefix in the model column of the products table, so it shouldn't be touched at all. I'm clueless as to what is going on. Are there MySQL integrity checks deleting records? Could there be another plugin working on osCommerce?

    Read the article

  • Static struct in C++

    - by pingvinus
    Hi, I want to define a structure where some math constants would be stored. Here is what I have now:

        struct consts {
            // salt density, kg/m3
            static const double gamma;
        };
        const double consts::gamma = 2350;

    It works fine, but there will be more than 10 floating point constants, so I don't want to write 'static const' before each of them. I'd like to define something like this instead:

        static const struct consts {
            // salt density, kg/m3
            double gamma;
        };
        const double consts::gamma = 2350;

    It looks fine, but I get these errors: 1. member function redeclaration not allowed; 2. a nonstatic data member may not be defined outside its class. I'm wondering if there is any C++ way to do it?

    Read the article

  • Do you use enums in your data applications and how?

    - by Ivan
    My experience is that a typical enterprise application has a lot of entities whose nature corresponds to a simple enumeration. For example, we may have an Order entity with fields such as "OrderType", "OrderStatus", "Currency", etc., referencing corresponding entities which are nothing more than a textual name bound to a key. Using enums would look very natural here, but enums have to be defined in application code at design time, am I right? Meanwhile we need to be able to CRUD the enum value variants at runtime and use them in server-side SQL (such as stored procedures and views). What are your practices and thoughts on this subject? I am particularly interested in C# 4, LINQ and T-SQL.
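
    One pattern that often comes up for this (a sketch in C# with made-up names, not a recommendation from this thread): keep the lookup table as the source of truth so values can be CRUDed at runtime and joined in stored procedures, and mirror only the stable, code-referenced members in an enum whose numeric values match the table keys.

        using System;
        using System.Collections.Generic;
        using System.Linq;

        // Mirrors the hypothetical OrderStatus lookup table; values must match its keys.
        public enum OrderStatus
        {
            Pending = 1,
            Shipped = 2,
            Cancelled = 3
        }

        public class Order
        {
            public int Id { get; set; }
            public int OrderStatusId { get; set; }   // raw FK column as stored in the database

            // Typed view over the FK; values added at runtime still round-trip as plain ints.
            public OrderStatus Status
            {
                get { return (OrderStatus)OrderStatusId; }
                set { OrderStatusId = (int)value; }
            }
        }

        public static class Demo
        {
            public static void Main()
            {
                var orders = new List<Order>
                {
                    new Order { Id = 1, OrderStatusId = (int)OrderStatus.Pending },
                    new Order { Id = 2, OrderStatusId = 2 }
                };

                // LINQ against the typed property reads naturally in application code.
                var shipped = orders.Where(o => o.Status == OrderStatus.Shipped).ToList();
                Console.WriteLine(shipped.Count);
            }
        }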

    Read the article

  • Is a base class with shared fields and functions good design?

    - by eych
    I've got a BaseDataClass with shared fields and functions:

        Protected Shared dbase As SqlDatabase
        Protected Shared dbCommand As DBCommand
        ...
        ' also a sync object used by the derived classes for SyncLock'ing
        Protected Shared ReadOnly syncObj As Object = New Object()
        Protected Shared Sub Init()      ' initializes fields, sets connections
        Protected Shared Sub CleanAll()  ' closes connections, disposes, etc.

    I have several classes that derive from this base class. The derived classes have all Shared functions that can be called directly from the BLL with no instantiation. The functions in these derived classes call the base Init(), call their specific stored procs, call the base CleanAll() and then return the results. So if I have 5 derived classes with 10 functions each, 50 possible function calls in total, then since they are all Shared the CLR only runs one call at a time, right? All calls are queued to wait until each Shared function completes. Is there a better design that keeps Shared functions in the DAL while still having base class functions? Or, since I have a base class, is it better to move towards instance methods within the DAL?
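
    For comparison, a minimal sketch (in C#, with hypothetical names) of the instance-per-call alternative asked about at the end: each call owns its own connection and command, so no shared state or locking is needed and calls do not queue behind one another.

        using System.Data;
        using System.Data.SqlClient;

        public abstract class BaseDataAccess
        {
            private readonly string _connectionString;

            protected BaseDataAccess(string connectionString)
            {
                _connectionString = connectionString;
            }

            // Opens, executes and disposes everything per call; safe to run concurrently.
            protected DataTable ExecuteProc(string procName, params SqlParameter[] args)
            {
                using (var connection = new SqlConnection(_connectionString))
                using (var command = new SqlCommand(procName, connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.Parameters.AddRange(args);

                    var table = new DataTable();
                    using (var adapter = new SqlDataAdapter(command))
                    {
                        adapter.Fill(table);
                    }
                    return table;
                }
            }
        }

        public class OrderData : BaseDataAccess
        {
            public OrderData(string connectionString) : base(connectionString) { }

            public DataTable GetOrders(int customerId)
            {
                return ExecuteProc("usp_GetOrders", new SqlParameter("@CustomerId", customerId));
            }
        }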

    Read the article

  • Display field from another table in SQL

    - by Roland Bengtsson
    I'm a newbie with SQL... I want to display some instances of AddrDistance in a DevExpress cxGrid using SQL:

        Select Cast((DistanceAsMeters * 0.001) as Decimal(8,1)) DistanceAsKm,
               bold_id, created, fromAddress, toAddress
        From AddrDistance
        Where DistanceAsMeters = 0 and PseudoDistanceAsCostKm = 0
          and not AddrDistance.bold_id in (select bold_id from DistanceQueryTask)
        Order By Created Desc

    This SQL works and the result is:

        DistanceAsKM  Bold_ID  Created     FromAddress  ToAddress
        0             134808   16.02.2010  121795       134570
        0             121701   10.03.2010  120850       122991

    The result I want is this:

        DistanceAsKM  Bold_ID  Created     FromAddress  ToAddress
        0             134808   16.02.2010  Kalmar       Stockholm
        0             121701   10.03.2010  Falkenberg   Oslo

    So the number of rows is right, but I want to replace the numbers in FromAddress and ToAddress with strings from another table. The numbers shown here are just bold_ids; every object in the database has a unique bold_id. The addresses above are stored in the Address table, which has a City column and a bold_id as a key. What should I write in SQL to get this right? Is there something in the cxGrid that could help here? Regards

    Read the article

  • scanf: stop reading when EOL is seen

    - by gcc
    scanf("%[^\n]\n",A[i]); /*that reads input line by line; example:first line is stored in A[0]*/ -> but I want read each element of line and send to struct fuction until the EOL (end of line) -> explaining: in current line,read one data ,then send to struct funcion to hold,after then ,in for loop, read next data decide it is float then send it to function. when eol is read, then activate next struct. >question is I want write something in scanf such that I stop read when i see eol. can I do for( ; ; ) { scanf("...",Sam); if(Sam=='\n) break; }

    Read the article

  • Mixing Transaction Script pattern with DDD/CQRS

    - by Herman
    Hi all, here is the situation: in order to support our legacy system, we need to insert into a table whenever a user logs in. This is basically a CRUD operation, so it doesn't really make sense to create a repository/entity/command/event for it, since it isn't tied to any business rules at all. The only benefit of creating a CQRS command is that the database write can happen asynchronously under that model. Which is the better route to take? Use CQRS and call a stored proc when handling that command, or just call the database directly in the controller? (I am using ASP.NET MVC.)
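
    If the transaction-script route wins, a rough sketch (all names invented for illustration) of keeping the insert out of the controller body while still getting the asynchronous write, without any CQRS machinery:

        using System.Data;
        using System.Data.SqlClient;
        using System.Threading;

        public class LoginAudit
        {
            private readonly string _connectionString;

            public LoginAudit(string connectionString)
            {
                _connectionString = connectionString;
            }

            // Called from the controller after a successful login; the write happens
            // on a thread-pool thread so the request isn't held up by the legacy table.
            public void RecordAsync(string userName)
            {
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    using (var connection = new SqlConnection(_connectionString))
                    using (var command = new SqlCommand("usp_RecordLogin", connection))
                    {
                        command.CommandType = CommandType.StoredProcedure;
                        command.Parameters.AddWithValue("@UserName", userName);
                        connection.Open();
                        command.ExecuteNonQuery();
                    }
                });
            }
        }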

    Read the article

  • T-SQL: Build Nested Set From Parent-Child Relationship

    - by Peder Rice
    I have a table that stores my Customer hierarchy with a nested set (due to the specific design of the application, I wasn't able to leverage just a Customer/Parent Customer mapping table). To simplify maintenance of this table, I've built a couple of stored procedures to handle moving nodes around and creating new nodes, but it's significantly more work than maintaining a Customer/Parent Customer table. Further, these structures are very fragile. So I'm looking for a way to have a Customer/Parent Customer table and then convert that table to a nested set on demand. Does anyone have a link to such an implementation?
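
    Absent a ready-made link, a minimal sketch of the on-demand conversion (class and field names here are assumptions): load the Customer/ParentCustomer rows into memory and number them with a depth-first walk, assigning each node its left value on the way down and its right value on the way back up, which is the nested-set layout.

        using System.Collections.Generic;
        using System.Linq;

        public class CustomerNode
        {
            public int Id;
            public int? ParentId;   // null for root customers
            public int Left;
            public int Right;
        }

        public static class NestedSetBuilder
        {
            // Fills in Left/Right for every node, ready to be bulk-written back
            // into the nested-set table.
            public static void AssignBounds(IList<CustomerNode> nodes)
            {
                var byParent = nodes.ToLookup(n => n.ParentId);
                int counter = 0;
                foreach (var root in byParent[(int?)null])
                    counter = Visit(root, byParent, counter);
            }

            private static int Visit(CustomerNode node, ILookup<int?, CustomerNode> byParent, int counter)
            {
                node.Left = ++counter;
                foreach (var child in byParent[node.Id])
                    counter = Visit(child, byParent, counter);
                node.Right = ++counter;
                return counter;
            }
        }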

    Read the article

  • Insert string from database into an aspx and have databinding expressions within that text evaluated

    - by Christopher Edwards
    Well, as you can see I want something quick and dirty! How can I take a string from the database that contains ASPX databinding syntax and have the databinding expressions evaluated? So I have text such as this stored in the DB:

        Hello <%=User.Name %> I haven't seen you since <%=User.LastVisit %>

    And I want it to be inserted here, say:

        ...
        <body>
            <form id="form1" runat="server">
            <div>
                <asp:FormView ID="FormView1" DataSourceID="DataSource1" runat="server">
                    <ItemTemplate>
                        <-- insert stuff here! -->
        ...

    But with the databinding expressions evaluated. I'd rather not build a full parsing, templating and evaluation engine...
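
    A quick-and-dirty sketch along those lines (hypothetical names; it substitutes a fixed set of known tokens rather than evaluating arbitrary expressions): replace each <%= ... %> token from a dictionary and drop the result into a Literal inside the ItemTemplate.

        using System.Collections.Generic;
        using System.Text.RegularExpressions;

        public static class SnippetExpander
        {
            private static readonly Regex Token = new Regex(@"<%=\s*(?<expr>[^%]+?)\s*%>");

            public static string Expand(string template, IDictionary<string, string> values)
            {
                return Token.Replace(template, match =>
                {
                    string expr = match.Groups["expr"].Value;   // e.g. "User.Name"
                    string value;
                    return values.TryGetValue(expr, out value) ? value : match.Value;
                });
            }
        }

        // Usage, e.g. when the FormView data-binds:
        //   var values = new Dictionary<string, string> {
        //       { "User.Name", currentUser.Name },
        //       { "User.LastVisit", currentUser.LastVisit.ToShortDateString() } };
        //   snippetLiteral.Text = SnippetExpander.Expand(textFromDb, values);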

    Read the article

  • Generate markdown from source code doc-comments

    - by erikkallen
    Is there any easy way to generate markdown from source code comments? I'm looking for something similar to Sandcastle, but with Markdown (or something like it) as output. The full story is that I use DokuWiki to generate documentation. It has many nice properties, including that all its data is stored in text files. So my idea is that I should somehow be able to get my API docs into the same place as the rest of my documentation. So far my best bet seems to be to write a custom documenter for NDoc3, but perhaps someone has already done something similar which would save me the trouble. I realize I might need to tweak it a little due to slight syntax differences.
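
    A minimal sketch of the usual starting point (paths and names are assumptions): the C# compiler's /doc option already writes every doc comment to an XML file, and a short converter can walk that file and emit Markdown (or DokuWiki syntax with different literals).

        using System;
        using System.IO;
        using System.Linq;
        using System.Xml.Linq;

        public static class XmlDocToMarkdown
        {
            public static void Convert(string xmlDocPath, string outputPath)
            {
                XDocument doc = XDocument.Load(xmlDocPath);

                // One "## member" heading plus its <summary> text per documented member.
                var lines =
                    from member in doc.Descendants("member")
                    let name = (string)member.Attribute("name")
                    let summary = member.Element("summary")
                    where summary != null
                    select "## " + name + Environment.NewLine + summary.Value.Trim() + Environment.NewLine;

                File.WriteAllLines(outputPath, lines.ToArray());
            }
        }

        // Usage: XmlDocToMarkdown.Convert(@"bin\MyLibrary.xml", @"doc\api.md");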

    Read the article

  • Image Data via Ajax - how can I display the image on the Page

    - by Mike B
    I am creating a Domino document via AJAX that contains a photo, and I am able to get the base64 image data back to the server in a Notes Domino document. The data is stored in a rich text (textarea) field as "data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAFA..." (this goes on for several lines). I am trying to display it on the Domino web page using the passthru tag <image id="pic1">. In the onLoad event of the form I try to shove the data into the image element using this code:

        // Photo stuff
        alert(document.forms[0].photo1.value);
        document.getElementById("pic1").src = document.forms[0].photo1.value;

    The alert shows the data, but the picture is not appearing. Please help. Thanks, Mike

    Read the article

  • Binding key/value pairs loaded from xml

    - by Hichem
    I want to load key/value configuration pairs stored in an XML file. To bind a collection of data I know I need to use the ArrayList class, but the problem is that I want to be able to bind the loaded values by their corresponding keys and not by their indexes in the ArrayList object. For example, I want to be able to do this:

        <mx:Text id="errorText" text="{Config.params['someKey']}" />

    instead of:

        <mx:Text id="errorText" text="{Config.params[0]}" />

    where Config.params is an ArrayList (obviously I can't use ArrayList for this, since it doesn't allow selecting a value by key). So the question is how to bind key/value pairs loaded from XML. I don't want to set the variables manually; I want to bind them so that when they are loaded they are set automatically. Has anyone had to do something like this?

    Read the article

  • ORM framework that extends base class with database-implementation.

    - by aioobe
    I have a game consisting of a client/server plus a webpage. A central notion in both the client and the game/web server is an Account. Accounts are stored in a database, so I'm in need of some ORM, and I recently had a look at Hibernate and Cayenne. My understanding, however, is that both frameworks provide a "DatabaseBackedAccount" class which I then extend with my other Account methods. My problem is that the Account class is reused heavily on the client side, and I obviously don't want to include database-related code in the client implementation. My current solution is to have an Account class (shared by server and client) and extend it with a DatabaseBackedAccount (overriding setter methods and providing a commit method) on the server side. I find this quite natural and nice; however, I've had to implement all the gory SQL details and ORM myself. Is there any way to "turn the tables" in an existing ORM framework, so that the generated classes extend my existing class?

    Read the article

  • Is a .Net membership database portable, or are accounts somehow bound to the originating Web site or

    - by Deane
    I have an ASP.NET Web site using .NET Membership with a SQL Server provider, so the users and roles are stored in the SQL tables created by aspnet_regsql.exe. Is this architecture totally self-contained and portable, or are users in it somehow bound to the specific Web site on which they created their account? Put another way, if we create a bunch of users in dev or UAT, then back up and restore this database to another server, accessed under another domain name, should it still work just fine? We're seeing some odd behavior when we move the database, like users losing group affiliation and such, and I'm curious how portable and environment-agnostic this database really is. I have a sneaking suspicion that something is bound to the machine key or the domain.
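
    One setting worth checking (a hedged guess, not a confirmed diagnosis): SqlMembershipProvider and SqlRoleProvider scope users and roles by an applicationName that defaults to the site's virtual path, so if that differs between environments the accounts are still in the database but appear to lose their roles. Pinning it explicitly in web.config in every environment keeps the data portable, roughly like this:

        <system.web>
          <membership defaultProvider="SqlProvider">
            <providers>
              <clear />
              <add name="SqlProvider"
                   type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a4a"
                   connectionStringName="MembershipDb"
                   applicationName="/MyApp" />
            </providers>
          </membership>
          <roleManager enabled="true" defaultProvider="SqlRoleProvider">
            <providers>
              <clear />
              <add name="SqlRoleProvider"
                   type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a4a"
                   connectionStringName="MembershipDb"
                   applicationName="/MyApp" />
            </providers>
          </roleManager>
        </system.web>

        <!-- "MembershipDb", "/MyApp" and the assembly version are placeholders;
             the important part is that applicationName is identical everywhere. -->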

    Read the article

  • aspnet_regsql not working at all

    - by user252160
    I would like to create the ASP.NET user database schema in a database of my own, because I'd like to fully integrate the user system with the rest of my DB. As I've read, I need to use the aspnet_regsql tool. I supplied all the options (because my database runs on SQLEXPRESS and is an .mdf file in my project's folder). The program starts and seemingly runs without any errors; however, when I open the database afterwards, no tables or stored procedures have been added. One more thing: I did one more test and intentionally gave the -d option a wrong .mdf file path, and surprisingly the program "finished" correctly, yet no file was created or modified whatsoever.

    Read the article

  • Issue using Session in MVC actions with [Authorize]

    - by Pablo Gonzalez
    Hi all, first of all sorry for my poor English! When I use the [Authorize] attribute I can't get Session data that I stored before. For example:

        public ViewResult Index()
        {
            // do some stuff
            Session["Test"] = "Hi stackoverflow!";
        }

    And then I try to get it in another action, but with the [Authorize] attribute:

        [Authorize]
        public ViewResult Test()
        {
            // do some stuff
            if (Session["Test"] == null)
            {
                // do some stuff
            }
        }

    Session["Test"] is always null, but if I remove the attribute it works. Can anyone help me? Thanks a lot! P.S. I also initialize Session["Test"] in Session_Start.

    Read the article

  • Synchronize an SVN repo (svnsync) with encoding errors

    - by Hamish
    Is it possible to fix/bypass non-UTF-8 encoded svn:log records when synchronizing repositories with svnsync? Background: I'm in the process of taking over the maintenance of an open source module that is stored within a large (well over 10,000 revisions) Subversion (1.5.5) repository. I do not have admin access to the remote repository to dump/filter/load the module. The old repository is being discontinued and I am trying to sync the original sub-module to my local (1.6+) repository with svnsync. For example:

        svnsync file://home/svn/temp-repo/ http://path.to.repo/modulename/

    The problem is that the old repository didn't enforce UTF-8 encoding and I'm hitting errors like:

        svnsync: Cannot accept 'svn:log' property because it is not encoded in UTF-8

    I can't modify the log property in the source repository, so I need to somehow modify or ignore the property value when the encoding is unknown/invalid. Any ideas? For example, is it possible to write a pre-revprop-change script to modify the log property in transit?

    Read the article

  • Magento: Syncing product/category translations from "Store 1 > German Store View" to "Store 2 > German Store View"

    - by mattalexx
    For two different store views using the same locale, it is easy to manage translations for miscellaneous text that's stored in CSV files; it's just a question of configuring the locale correctly so the correct CSV files are used. But my client has entered a bunch of translations for products and categories into the admin by changing the scope to "Store 1 German" and setting the translations there. Now he has Store 2, with its own German store view. How does he keep "Store 1 German Store View" and "Store 2 German Store View" in sync?

    Read the article

  • How can you configure or extend BITS (Background Intelligent Transfer Service) to read files from a

    - by Mark
    I have an ASP.NET load-balanced application (web service and website) that runs on SQL Server. I need to be able to provide large files for download; however, because of the load balancing, the files are stored in the SQL database as opposed to the file system. BITS seems to be the best approach, and I have full control of the client. However, I don't know how to configure BITS to read the file from the database. I know how to write the C# code for that, but I don't know how to get BITS to hook into it as opposed to reading the file from the file system. Any ideas?
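
    A rough sketch of the usual approach (handler, table and column names are all assumptions): BITS pulls files over plain HTTP using Range requests, so an ASP.NET handler that reads the requested byte range out of the database and answers with 206 Partial Content gives BITS something it can download and resume from, even though no file exists on disk.

        using System;
        using System.Data.SqlClient;
        using System.Web;

        public class DbFileHandler : IHttpHandler
        {
            private const string ConnectionString = "..."; // real connection string goes here

            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                int fileId = int.Parse(context.Request.QueryString["id"]);
                long totalLength = GetLength(fileId);

                // Parse a simple "bytes=start-end" range; default to the whole file.
                long start = 0, end = totalLength - 1;
                string range = context.Request.Headers["Range"];
                bool partial = !string.IsNullOrEmpty(range) && range.StartsWith("bytes=");
                if (partial)
                {
                    string[] parts = range.Substring("bytes=".Length).Split('-');
                    start = long.Parse(parts[0]);
                    if (parts.Length > 1 && parts[1].Length > 0)
                        end = long.Parse(parts[1]);
                }

                context.Response.StatusCode = partial ? 206 : 200;
                context.Response.ContentType = "application/octet-stream";
                context.Response.AddHeader("Accept-Ranges", "bytes");
                context.Response.AddHeader("Content-Length", (end - start + 1).ToString());
                if (partial)
                    context.Response.AddHeader("Content-Range",
                        string.Format("bytes {0}-{1}/{2}", start, end, totalLength));

                // SUBSTRING works on varbinary(max), so only the requested slice is read.
                using (var connection = new SqlConnection(ConnectionString))
                using (var command = new SqlCommand(
                    "SELECT SUBSTRING(Content, @start + 1, @length) FROM LargeFiles WHERE Id = @id",
                    connection))
                {
                    command.Parameters.AddWithValue("@id", fileId);
                    command.Parameters.AddWithValue("@start", start);
                    command.Parameters.AddWithValue("@length", end - start + 1);
                    connection.Open();
                    byte[] chunk = (byte[])command.ExecuteScalar();
                    context.Response.OutputStream.Write(chunk, 0, chunk.Length);
                }
            }

            private long GetLength(int fileId)
            {
                using (var connection = new SqlConnection(ConnectionString))
                using (var command = new SqlCommand(
                    "SELECT DATALENGTH(Content) FROM LargeFiles WHERE Id = @id", connection))
                {
                    command.Parameters.AddWithValue("@id", fileId);
                    connection.Open();
                    return Convert.ToInt64(command.ExecuteScalar());
                }
            }
        }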

    Read the article

  • Is there alternative way to write this query?

    - by Kugel
    I have tables A, B, C, where A represents items which can have zero or more sub-items stored in C, and B only has two foreign keys to connect A and C. I have this SQL query:

        select *
        from A
        where not exists (select *
                          from B natural join C
                          where B.id = A.id and C.value > 10);

    which says: "give me every item from table A where all sub-items have a value less than 10". Is there a way to optimize this? And is there a way to write it without using the exists operator?

    Read the article

  • Track/Display Array Index As Part Of Cout (C++)

    - by John Smith
    Hi, I have a command line C++ program that lets you enter basic information about a person (ID number, name, age, etc.) and I want to output it to the console in the following manner:

        -------------------------------------------------------------------
        Index    ID #     First Name    Last Name    Age
        -------------------------------------------------------------------
        0        1234     John          Smith        25

    The person objects are stored in an array of Persons, and I've overloaded the ostream (<<) operator to print out all of the fields as you see. The dashed lines and header come from a displayHdg() function. Anyhow, I have not been able to figure out how to get the proper index value for the array. Ideally, I'd like to generate the indices for each line, but all my attempts have failed. The array is looped through and each object printed in the main() function, and the ostream operator is overloaded in the Person class, so I tried to use global variables as well as static variables, and all of those produce incorrect numbering (i.e. they show 0, 1 the first time (for 2 objects), then change to 1, 2 on the next display). Any ideas?

    Read the article

  • Adding custom/new properties to any file regardless of type and extension e.g. setting 'Author' on a

    - by Vaibhav Garg
    I want the ability to add properties and tags to a file (specifically ebook files and ebook-related properties in Windows 7, but I'm interested in covering as many OSes as possible). For example, Example.txt, Example.doc or Example.epub should all store and carry properties like 'Author', 'Publication date', 'Tags', etc. The properties should be stored with the file itself, such that if it is transferred to another system it retains the properties (even if I need to install 'my app' to support this function on the other machine). How do I make this possible using .NET (preferred), and what file system concepts should I learn to understand the underlying concepts and limitations in order to implement this feature? Is there any application that already does this? Thank you

    Read the article

  • Place an object on top of stack in ILGenerator

    - by KiNGPiN
    I have to pass a function an instance of an object, so obviously all the information to be taken as an argument has to be loaded onto the evaluation stack. Here is the kind of code I am looking for:

        someClass SomeObject = new someClass();
        il.Emit(OpCodes.LoadObject, SomeObject);        // no such overload; this is what I want to do
        il.Emit(OpCodes.Callvirt, /* MethodInfo of Function */);

        public void Function(Object obj)
        {
            Type type = obj.GetType();
            // do something with respect to the type
        }

    I don't require any information stored in the class, just the type, and I cannot use any of the primitive types to base my decision on. I last read that I can use a pointer to load the type using some opcodes... but I am completely lost here. Any help or pointers in the right direction would be great :)
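
    A minimal sketch of the usual trick when only the Type is needed (method and class names below are made up): emit the type as a metadata token with Ldtoken, convert it with Type.GetTypeFromHandle, and call a method that takes a Type parameter instead of an object instance.

        using System;
        using System.Reflection.Emit;

        public static class TypeDispatch
        {
            // The method the generated IL will call; it decides behaviour from the Type alone.
            public static void Function(Type type)
            {
                Console.WriteLine("Called with " + type.FullName);
            }

            public static Action BuildCaller(Type typeToPass)
            {
                var method = new DynamicMethod("CallWithType", null, Type.EmptyTypes);
                ILGenerator il = method.GetILGenerator();

                il.Emit(OpCodes.Ldtoken, typeToPass);                               // push RuntimeTypeHandle
                il.Emit(OpCodes.Call, typeof(Type).GetMethod("GetTypeFromHandle")); // handle -> Type
                il.Emit(OpCodes.Call, typeof(TypeDispatch).GetMethod("Function"));  // Function(Type)
                il.Emit(OpCodes.Ret);

                return (Action)method.CreateDelegate(typeof(Action));
            }
        }

        // Usage: TypeDispatch.BuildCaller(typeof(string))() prints "Called with System.String".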

    Read the article

  • Export GridView to TXT, then upload file to server

    Basically what I want to do is export an array (or GridView) to a file called "getpathin.dat". That is easy, but the method I am using downloads the file to my computer, which is not what I want. I want to write the array to either a pre-existing file that is on the server, or a new file created in a folder on the server, and this file will contain either the array or the GridView, which I have stored in the array. I'll be doing this in Visual Studio 2008 / SQL Server using C#.
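
    A small sketch of writing the array on the server instead of sending it to the browser (the folder and file name are just examples): Server.MapPath resolves a path inside the web application, and File.WriteAllLines creates or overwrites the file there.

        using System;
        using System.IO;
        using System.Web.UI;

        public partial class ExportPage : Page
        {
            protected void ExportButton_Click(object sender, EventArgs e)
            {
                string[] rows = BuildRowsFromGrid();   // the array that already holds the grid data

                // Resolves to e.g. C:\inetpub\wwwroot\MyApp\App_Data\getpathin.dat on the server.
                string path = Server.MapPath("~/App_Data/getpathin.dat");
                File.WriteAllLines(path, rows);
            }

            private string[] BuildRowsFromGrid()
            {
                // ... populate from the GridView / existing array as in the current code ...
                return new[] { "row1", "row2" };
            }
        }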

    Read the article

  • SQL Server 2008 BULK INSERT causes more reads than writes. Why?

    - by sh1ng
    I have a huge table (a few billion rows) with a clustered index and two non-clustered indexes. A BULK INSERT operation produces 112000 reads and only 383 writes (duration 19948 ms). This is very confusing to me. Why do reads exceed writes? How can I reduce them? Update, the query:

        insert bulk DenormalizedPrice4 ([DP_ID] BigInt, [DP_CountryID] Int, [DP_OperatorID] SmallInt,
            [DP_OperatorPriceID] BigInt, [DP_SpoID] Int, [DP_TourTypeID] Int, [DP_CheckinDate] Date,
            [DP_CurrencyID] SmallInt, [DP_Cost] Decimal(9,2), [DP_FirstCityID] Int, [DP_FirstHotelID] Int,
            [DP_FirstBuildingID] Int, [DP_FirstHotelGlobalStarID] Int, [DP_FirstHotelGlobalMealID] Int,
            [DP_FirstHotelAccommodationTypeID] Int, [DP_FirstHotelRoomCategoryID] Int,
            [DP_FirstHotelRoomTypeID] Int, [DP_Days] TinyInt, [DP_Nights] TinyInt,
            [DP_ChildrenCount] TinyInt, [DP_AdultsCount] TinyInt, [DP_TariffID] Int,
            [DP_DepartureCityID] Int, [DP_DateCreated] SmallDateTime, [DP_DateDenormalized] SmallDateTime,
            [DP_IsHide] Bit, [DP_FirstHotelAccommodationID] Int) with (CHECK_CONSTRAINTS)

    There are no triggers or foreign keys. The clustered index is on DP_ID, plus two non-unique indexes (with fillfactor = 90%). One more thing: the DB is stored on RAID 50 with a 256K stripe size.

    Read the article
