Search Results

Search found 30293 results on 1212 pages for 'database insider'.


  • DB Architecture: Linking to intersection or to main tables?

    - by Jean-Nicolas
    Hi, I'm creating a fantasy football system on my website, but I'm confused about how I should link some of my tables. Tables: The main table is Pool, which holds all the info about the rules of the fantasy draft. A standard table User contains the usual stuff. An intersection table called pools_users contains id, pool_id, user_id, because a user can be in more than one pool and a pool contains more than one user. The problem: the table Selections is the one causing trouble. It holds the selections a user makes for his pool. It is related to the Player table, but that's not relevant for this problem. Should I link this table to the intersection table pools_users, or should I link it to both main tables, Pool and User? This table contains id, pool_id, user_id, player_id, ... What is the best way to link my tables? When I retrieve the data, I normally want the information grouped by user ("this user has those selections, that one has those selections, etc.").
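    One way to model it, sketched below (the column types and the lowercase table names selections, pools_users, players are assumptions), is to give selections a single foreign key to the pools_users row, since a selection only makes sense for a user inside a particular pool:

      CREATE TABLE selections (
        id           INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        pool_user_id INT UNSIGNED NOT NULL,   -- points at one (pool, user) membership row
        player_id    INT UNSIGNED NOT NULL,
        FOREIGN KEY (pool_user_id) REFERENCES pools_users (id),
        FOREIGN KEY (player_id)    REFERENCES players (id)
      ) ENGINE=InnoDB;

      -- Selections for one pool, grouped by user:
      SELECT pu.user_id, s.player_id
      FROM pools_users pu
      JOIN selections s ON s.pool_user_id = pu.id
      WHERE pu.pool_id = 1
      ORDER BY pu.user_id;

    Keeping pool_id and user_id out of selections avoids the risk of a selection row pointing at a (pool, user) pair that is not actually present in pools_users.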

    Read the article

  • Help me write a nicer SQL query in Rails

    - by Sainath Mallidi
    Hi, I am trying to write an SQL query to update some attributes that are regularly pulled from a source. The output of that pull is a text file with the following fields: author, title, date, popularity. I have two tables to update: one holds the author information and the other is the popularity table, and the Author ActiveRecord object has one popularity. Currently I'm doing it like this:

      arr.each { |x|
        x = x.split(" ")
        results = Author.find_by_sql("SELECT authors.id FROM authors, popularity WHERE authors.id = popularity.authors_id AND authors.author = '#{x[0]}'")
        results[0].popularity.update_attribute("popularity", x[3])
      }

    I need two tables because the popularity keeps changing, and I need only the top 1000 popular ones, but I still need to keep the previously popular ones as well. Is there a nicer way to do this, instead of one query for every new object? Thanks.
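    A set-based alternative, sketched under assumptions (MySQL, plus a hypothetical staging table raw_popularity loaded from the text file, e.g. with LOAD DATA INFILE), would replace the per-author round trips with one UPDATE:

      UPDATE popularity p
      JOIN authors a        ON a.id = p.authors_id
      JOIN raw_popularity r ON r.author = a.author
      SET p.popularity = r.popularity;

    The Rails code then only has to load the file into raw_popularity and fire the single statement.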

    Read the article

  • Hibernate Auto-Increment not working

    - by dharga
    I have a column in my DB that is set with IDENTITY(1,1), and I can't get the Hibernate annotations to work for it; I get errors when I try to create a new record. In my entity I have the following:

      @GeneratedValue(strategy=GenerationType.IDENTITY, generator="native")
      @Column(name="SeqNo", unique=true, nullable=false)
      BigDecimal seqNo;

    But when I try to add a new record I get the following error: Cannot insert explicit value for identity column in table 'MemberSelectedOptions' when IDENTITY_INSERT is set to OFF. I don't want to set IDENTITY_INSERT to ON, because I want the identity column in the DB to manage the values. The SQL that is run is the following, where you can clearly see the insert:

      insert into dbo.MemberSelectedOptions (OptionStatusCd, EffectiveDate, TermDate, SelectionStatusDate, SysLstUpdtUserId, SysLstTrxDtm, SourceApplication, GroupId, MemberId, OptionId, SeqNo)
      values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)

    What am I missing?

    Read the article

  • Tablesorter - filter inside input fields and values

    - by Zeracoke
    I have a small quest to accomplish, and I reached a point when nothing works... So the problem is. I have a paged table with a lot of input fields inside the rows with values, and I would like to search inside these values. Let me Show this, I hope that somebody will got the idea what I should do... <script type="text/javascript"> // add parser through the tablesorter addParser method $.tablesorter.addParser({ id: 'inputs', is: function(s) { return false; }, format: function(s, table, cell, cellIndex) { var $c = $(cell); // return 1 for true, 2 for false, so true sorts before false if (!$c.hasClass('updateInput')) { $c .addClass('updateInput') .bind('keyup', function() { $(table).trigger('updateCell', [cell, false]); // false to prevent resort }); } return $c.find('input').val(); }, type: 'text' }); $(function() { $('table').tablesorter({ widgets: ['zebra', 'stickyHeaders', 'resizable', 'filter'], widgetOptions: { stickyHeaders : '', // number or jquery selector targeting the position:fixed element stickyHeaders_offset : 110, // added to table ID, if it exists stickyHeaders_cloneId : '-sticky', // trigger "resize" event on headers stickyHeaders_addResizeEvent : true, // if false and a caption exist, it won't be included in the sticky header stickyHeaders_includeCaption : true, // The zIndex of the stickyHeaders, allows the user to adjust this to their needs stickyHeaders_zIndex : 2, // jQuery selector or object to attach sticky header to stickyHeaders_attachTo : null, // scroll table top into view after filtering stickyHeaders_filteredToTop: true, resizable: true, filter_onlyAvail : 'filter-onlyAvail', filter_childRows : true, filter_startsWith : true, filter_useParsedData : true, filter_defaultAttrib : 'data-value' }, headers: { 1: {sorter: 'inputs', width: '50px'}, 2: {sorter: 'inputs'}, 3: {sorter: 'inputs'}, 4: {sorter: 'inputs'}, 5: {sorter: 'inputs'}, 6: {sorter: 'inputs'}, 7: {sorter: 'inputs', width: '100px'}, 8: {sorter: 'inputs', width: '140px'}, 9: {sorter: 'inputs'}, 10: {sorter: 'inputs'}, 11: {sorter: 'inputs'}, } }); $('table').tablesorterPager({container: $(".pager"), positionFixed: false, size: 50, pageDisplay : $(".pagedisplay"), pageSize : $(".pagesize"), }); $("#table1").tablesorter(options); /* make second table scroll within its wrapper */ options.widgetOptions.stickyHeaders_attachTo = '.wrapper'; // or $('.wrapper') $("#table2").tablesorter(options); }); </script> The structure of the tables: <tr class="odd" style="display: table-row;"> <form action="/self.php" method="POST"> </form><input type="hidden" name="f" value="data"> <td><input type="hidden" name="mod_id" value="741">741</td> <td class="updateInput"><input type="text" name="name" value="Test User Name"></td> <td class="updateInput"><input type="text" name="address" value="2548451 Random address"></td> <td class="updateInput"><input type="email" name="email" value=""></td> <td class="updateInput"><input type="text" name="entitlement" value="none"></td> <td class="updateInput"><input type="text" name="card_number" value="6846416548644352"></td> <td class="updateInput"><input type="checkbox" name="verify" value="1" checked=""></td> <td class="updateInput"><input type="checkbox" name="card_sended" value="1" checked=""></td> <td class="updateInput"><input type="text" name="create_date" value="2014-02-12 21:09:16"></td> <td class="updateInput"><a href="self.php?f=data&amp;del=741">X</a></td> <td class="updateInput"><input type="submit" value="SAVE"></td><td class="updateInput"></td></tr> So the thing is I 
don't know how to configure the filter to search inside these values. I have already added some options, but none of them work. Any help would be great!

    Read the article

  • How to select the product that has the maximum price in each category?

    - by kimleng
    Below is my table, with items such as:

      ProductId  ProductName  Category  Price
      1          Tiger        Beer      $12.00
      2          ABC          Beer      $13.99
      3          Anchor       Beer      $9.00
      4          Apolo        Wine      $10.88
      5          Randonal     Wine      $18.90
      6          Wisky        Wine      $30.19
      7          Coca         Beverage  $2.00
      8          Sting        Beverage  $5.00
      9          Spy          Beverage  $4.00
      10         Angkor       Beer      $12.88

    Suppose I have only three categories in this table (I can have a lot of categories in it). I want to show the product with the maximum price in each category.
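    A common approach, assuming the table is named Product (the name is not stated in the post), is to compute the per-category maximum in a derived table and join back to it:

      SELECT p.ProductId, p.ProductName, p.Category, p.Price
      FROM Product p
      JOIN (
        SELECT Category, MAX(Price) AS MaxPrice
        FROM Product
        GROUP BY Category
      ) m ON m.Category = p.Category AND m.MaxPrice = p.Price;

    Note that if two products tie for the top price in a category, both rows are returned.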

    Read the article

  • What is the fastest way to get a DataTable into SQL Server?

    - by John Gietzen
    I have a DataTable in memory that I need to dump straight into a SQL Server temp table. After the data has been inserted, I transform it a little bit, and then insert a subset of those records into a permanent table. The most time consuming part of this operation is getting the data into the temp table. Now, I have to use temp tables, because more than one copy of this app is running at once, and I need a layer of isolation until the actual insert into the permanent table happens. What is the fastest way to do a bulk insert from a C# DataTable into a SQL Temp Table? I can't use any 3rd party tools for this, since I am transforming the data in memory. My current method is to create a parameterized SqlCommand: INSERT INTO #table (col1, col2, ... col200) VALUES (@col1, @col2, ... @col200) and then for each row, clear and set the parameters and execute. There has to be a more efficient way. I'm able to read and write the records on disk in a matter of seconds...

    Read the article

  • Why does this simple MySQL procedure take way too long to complete?

    - by Howard Guo
    This is a very simple MySQL stored procedure. Cursor "commission" has only 3000 records, but the procedure call takes more than 30 seconds to run. Why is that?

      DELIMITER //
      DROP PROCEDURE IF EXISTS apply_credit//
      CREATE PROCEDURE apply_credit()
      BEGIN
        DECLARE done tinyint DEFAULT 0;
        DECLARE _pk_id INT;
        DECLARE _eid, _source VARCHAR(255);
        DECLARE _lh_revenue, _acc_revenue, _project_carrier_expense, _carrier_lh, _carrier_acc,
                _gross_margin, _fsc_revenue, _revenue, _load_count DECIMAL;
        DECLARE commission CURSOR FOR
          SELECT pk_id, eid, source, lh_revenue, acc_revenue, project_carrier_expense,
                 carrier_lh, carrier_acc, gross_margin, fsc_revenue, revenue, load_count
          FROM ct_sales_commission;
        DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
        DELETE FROM debug;
        OPEN commission;
        REPEAT
          FETCH commission INTO _pk_id, _eid, _source, _lh_revenue, _acc_revenue,
                _project_carrier_expense, _carrier_lh, _carrier_acc, _gross_margin,
                _fsc_revenue, _revenue, _load_count;
          INSERT INTO debug VALUES(concat('row ', _pk_id));
        UNTIL done = 1 END REPEAT;
        CLOSE commission;
      END//
      DELIMITER ;

      CALL apply_credit();
      SELECT * FROM debug;
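    One likely culprit, though this is an assumption rather than something stated in the post, is autocommit: if debug is an InnoDB table, every INSERT INTO debug is flushed as its own transaction, so 3000 fetch-and-insert iterations become 3000 commits. A quick way to test that is to run the call inside a single transaction:

      SET autocommit = 0;
      CALL apply_credit();
      COMMIT;
      SET autocommit = 1;

      SELECT * FROM debug;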

    Read the article

  • Is a foreign key reference to two different primary keys from two different tables valid?

    - by arundex
    I have a foreign key that has to refer to the primary keys of two different tables.

      Table 1: animal
        animal_id (primary key)
      Table 2: bird
        bird_id (primary key)
      Table 3: Pet_info
        pet_id, type ENUM('bird', 'animal'),
        foreign key (pet_id) references animal(animal_id), bird(bird_id)

    So I need to check pet_id against either the animal or the bird table, depending on the type. Is this valid? Or should I go for some restructuring? NOTE: I referred to this ... but I'm not sure whether I have to change my existing design.
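    A single foreign key constraint can only reference one table, so the declaration above will not work as written. One common restructuring, sketched here with assumed column types and a new parent table named pet, is to make animal and bird subtypes of a shared supertype and point Pet_info at the supertype:

      CREATE TABLE pet (
        pet_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        type   ENUM('bird', 'animal') NOT NULL
      );

      CREATE TABLE animal (
        animal_id INT UNSIGNED NOT NULL PRIMARY KEY,
        FOREIGN KEY (animal_id) REFERENCES pet (pet_id)
      );

      CREATE TABLE bird (
        bird_id INT UNSIGNED NOT NULL PRIMARY KEY,
        FOREIGN KEY (bird_id) REFERENCES pet (pet_id)
      );

      CREATE TABLE pet_info (
        pet_id INT UNSIGNED NOT NULL,
        FOREIGN KEY (pet_id) REFERENCES pet (pet_id)
      );

    Every animal or bird gets a row in pet first, and pet_info then needs only one foreign key.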

    Read the article

  • How can several different data types be saved in one table?

    - by poseidon
    This is my situation: I am building an ad-like application in Django and MySQL. I am using a flexible-ad approach where we have:

      A table with ad categories (home, furniture, cars, etc.):
        id_category, name
      A table with details for the ad categories (home: area, squared meters; car: seats, color):
        id_detail, id_category (the category the detail describes), name, type (boolean, char, int, long, etc.)
      The ad table (I am selling a house; I am selling a car):
        id_ad, id_category, text, date
      A table where I plan to consolidate the details of the ads (home: A-area, 500 sq. meters; car: 5 seats, red):
        id_detail_ad, id_ad, id_detail, value

    Is this possible? Can I have one table of details for all the ads, even if the details include numbers, text, booleans, etc.? Or would I have to save them all as text and then interpret them in code accordingly? Please express your opinions. Thank you.
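    If everything is stored as text, the declared type in the detail table can still drive casts at query time. A minimal sketch, assuming MySQL and made-up table names ad_detail and ad_detail_value for the detail and consolidated-value tables described above:

      CREATE TABLE ad_detail_value (
        id_detail_ad INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        id_ad        INT UNSIGNED NOT NULL,
        id_detail    INT UNSIGNED NOT NULL,
        value        VARCHAR(255) NOT NULL    -- everything stored as text
      );

      -- Numeric filtering by casting the text value:
      SELECT v.id_ad
      FROM ad_detail_value v
      JOIN ad_detail d ON d.id_detail = v.id_detail
      WHERE d.name = 'squared meters'
        AND CAST(v.value AS UNSIGNED) >= 500;

    The alternative, one typed value column (or table) per data type, is the subject of the next result below.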

    Read the article

  • How would you structure your entity model for storing arbitrary key/value data with different data types?

    - by Nathan Ridley
    I keep coming across scenarios where it will be useful to store a set of arbitrary data in a table using a per-row key/value model, rather than a rigid column/field model. The problem is, I want to store the values with their correct data type rather than converting everything to a string. This means I have to choose either a single table with multiple nullable columns, one for each data type, or a set of value tables, one for each data type. I'm also unsure whether I should use full third normal form and separate the keys into their own table, referencing them via a foreign key from the value table(s), or whether it would be better to keep things simple, store the string keys in the value table(s), and accept the duplication of strings.

    Old/bad: This solution makes adding additional values a pain in a fluid environment, because the table needs to be modified regularly.

      MyTable
      ====================================
      ID    Key1      Key2      Key3
      int   int       string    date
      ------------------------------------
      1     Value1    Value2    Value3
      2     Value4    Value5    Value6

    Single-table solution: This solution allows simplicity via a single table. The querying code still needs to check for NULLs to determine which data type the field is storing. A check constraint is probably also required to ensure only one of the value fields contains non-null data.

      DataValues
      =============================================================
      ID    RecordID   Key      IntValue   StringValue   DateValue
      int   int        string   int        string        date
      -------------------------------------------------------------
      1     1          Key1     Value1     NULL          NULL
      2     1          Key2     NULL       Value2        NULL
      3     1          Key3     NULL       NULL          Value3
      4     2          Key1     Value4     NULL          NULL
      5     2          Key2     NULL       Value5        NULL
      6     2          Key3     NULL       NULL          Value6

    Multiple-table solution: This solution gives each table a more concise purpose, though the code needs to know the data type in advance, as it has to query a different table for each data type. Indexing is probably simpler and more efficient because there are fewer columns to index.

      IntegerValues
      ===============================
      ID    RecordID   Key      Value
      int   int        string   int
      -------------------------------
      1     1          Key1     Value1
      2     2          Key1     Value4

      StringValues
      ===============================
      ID    RecordID   Key      Value
      int   int        string   string
      -------------------------------
      1     1          Key2     Value2
      2     2          Key2     Value5

      DateValues
      ===============================
      ID    RecordID   Key      Value
      int   int        string   date
      -------------------------------
      1     1          Key3     Value3
      2     2          Key3     Value6

    How do you approach this problem? Which solution is better? Also, should the key column be separated into its own table and referenced via a foreign key, or should it be kept in the value table and bulk-updated if for some reason a key name changes?
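    For reference, a sketch of the single-table variant in SQL. The constraint expression is an assumption about how "exactly one value column is non-null" might be enforced, and note that MySQL only enforces CHECK constraints from 8.0.16 onward:

      CREATE TABLE DataValues (
        ID          INT          NOT NULL AUTO_INCREMENT PRIMARY KEY,
        RecordID    INT          NOT NULL,
        `Key`       VARCHAR(100) NOT NULL,
        IntValue    INT          NULL,
        StringValue VARCHAR(255) NULL,
        DateValue   DATE         NULL,
        CHECK ((IntValue IS NOT NULL) + (StringValue IS NOT NULL) + (DateValue IS NOT NULL) = 1)
      );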

    Read the article

  • SQL clone record with a unique index

    - by Milhous
    Is there a clean way of cloning a record in SQL that has a unique index (auto increment)? I want to clone all the fields except the index. I currently have to enumerate every field and use that list in an INSERT ... SELECT, and I would rather not explicitly list all of the fields, as they may change over time.
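    SQL has no "all columns except one" syntax, but one workaround, sketched here under assumptions (MySQL, a table named widgets, an AUTO_INCREMENT column id, and row 42 as the one being cloned), stages the row in a temporary table:

      CREATE TEMPORARY TABLE tmp_clone SELECT * FROM widgets WHERE id = 42;
      UPDATE tmp_clone SET id = 0;    -- 0 makes AUTO_INCREMENT assign a fresh key on re-insert
      INSERT INTO widgets SELECT * FROM tmp_clone;
      DROP TEMPORARY TABLE tmp_clone;

    The id = 0 trick assumes the NO_AUTO_VALUE_ON_ZERO SQL mode is not enabled; since the column list is never written out, schema changes do not break it.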

    Read the article

  • MySQL query problem

    - by Luke
    OK, I have the following problem. I have two InnoDB tables, 'places' and 'events'. One place can have many events, but an event can be created without entering a place; in that case the event's foreign key is NULL. The simplified structure of the tables looks as follows:

      CREATE TABLE IF NOT EXISTS `events` (
        `id` int(11) unsigned NOT NULL AUTO_INCREMENT,
        `name` varchar(255) COLLATE utf8_bin NOT NULL,
        `places_id` int(9) unsigned DEFAULT NULL,
        PRIMARY KEY (`id`),
        KEY `fk_events_places` (`places_id`)
      ) ENGINE=InnoDB;

      ALTER TABLE `events`
        ADD CONSTRAINT `events_ibfk_1` FOREIGN KEY (`places_id`)
        REFERENCES `places` (`id`) ON DELETE CASCADE ON UPDATE NO ACTION;

      CREATE TABLE IF NOT EXISTS `places` (
        `id` int(9) unsigned NOT NULL AUTO_INCREMENT,
        `name` varchar(255) COLLATE utf8_bin NOT NULL,
        PRIMARY KEY (`id`)
      ) ENGINE=InnoDB;

    The question is: how do I construct a query that returns the name of the event and the name of the corresponding place (or no value, in case there is no place assigned)? I am able to do it with two queries, but then I am visibly separating events which have a place assigned from the ones that are without a place. Help really appreciated.
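    A LEFT JOIN does this in one query: events with a NULL places_id are kept and simply get NULL for the place name. A sketch against the tables above:

      SELECT e.name AS event_name, p.name AS place_name
      FROM events e
      LEFT JOIN places p ON p.id = e.places_id;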

    Read the article

  • Connecting to a fresh SQL Server installation

    - by ripper234
    I know MySQL, and I'd like to learn SQL Server. I'm currently stuck on the basics of basics: how to install and configure SQL Server, and how to connect to it. I installed SQL Server through the Web Platform Installer and have Visual Studio 2008 installed. Still, I can't understand how to connect to my server: I see that the SQL service itself (SQLEXPRESS) is running both in services.msc and in SQL Server Configuration Manager. I try to connect to it via Management Studio, but I don't understand what to do. Where do I begin?

    Read the article

  • Make DB connection persistent throughout Zend Framework

    - by kamikaze_pilot
    I'm using Zend Framework. Currently, every time I need to use the DB I go ahead and connect to it:

      function connect() {
          $connParams = array(
              "host"     => $host,
              "port"     => $port,
              "username" => $username,
              "password" => $password,
              "dbname"   => $dbname
          );
          $db = new Zend_Db_Adapter_Pdo_Mysql($connParams);
          return $db;
      }

    So I just call the connect() function every time I need to use the DB. My question is: suppose I want to reuse $db everywhere in my site, connect only once in the very initial stage of the site load, and then close the connection right before the page gets sent to the user. What would be the best practice to accomplish this? Which file in Zend should I save $db in, what method should I use to save it (a global variable?), and which file should I do the connection closing in?

    Read the article

  • Fastest way to store/retrieve a dictionary - SQL, text file...?

    - by AP257
    Hi all, this is a really, really super dumb question, so I apologise, but I'd be grateful for some advice. I've got a text file of words and word frequencies. It's very large; theoretically we're talking millions of rows. I just want to retrieve values from the file, and do it as quickly and efficiently as possible (for a web app, in Django). My question is: what is the best way to store and retrieve the values? Should I import them into SQL? Or keep the file and use grep? Or put them into a JSON dictionary? Or some other way? Sorry for the dumb question; I would be very grateful for advice!
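    If the SQL route wins, a minimal sketch (the table and column names here are made up) is just a keyed lookup table, so that fetching one word's frequency is a single index seek:

      CREATE TABLE word_frequency (
        word      VARCHAR(100) NOT NULL PRIMARY KEY,
        frequency INT UNSIGNED NOT NULL
      );

      SELECT frequency FROM word_frequency WHERE word = 'database';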

    Read the article

  • Virtual directories as DB queries

    I have a site, e.g. site.com I would like users to be able to access it in their locale at site.com/somecity This is similar to craigslist, but they do it with subdomains e.g. sfbay.craigslist.org Using Apache HTTP server. MySql for DB. If you can provide a brief explanation and perhaps links to more thorough discussions, I would be quite interested in learning. I'm developing a web-app and wonder if I should focus some time to read up on Apache, or should I focus more of my time on server-side programming. Thanks!

    Read the article

  • Walking through an SQLite Table

    - by galford13x
    I would like to implement or use functionality that allows stepping through a table in SQLite. If I have a table Products that has 100k rows, I would like to retrieve perhaps 10k rows at a time, something similar to how a web page lists data and has < Previous .. Next > links to walk through the data. Are there SELECT statements that can make this simple? I see and have tried using ROWID in conjunction with LIMIT, which seems OK if not ordering the data:

      -- This seems to work if not ordering.
      SELECT * FROM Products WHERE ROWID BETWEEN x AND y;
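    For ordered paging, keyset pagination is a common alternative to large OFFSETs. A sketch, assuming an indexed Id column (not stated in the post) and that the application remembers the last Id of the previous page:

      SELECT *
      FROM Products
      WHERE Id > :last_seen_id      -- last Id from the previous page; use 0 for the first page
      ORDER BY Id
      LIMIT 10000;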

    Read the article

  • How to handle expired items?

    - by Mark
    My site allows users to post things with an expiry date. Once an item has expired, it is no longer displayed in the listings. Posts can also be closed, canceled, or completed. I think it would be nicest to be able to check one attribute or status ("is active") rather than having to check all of [is not expired, is not completed, is not closed, is not canceled]. Handling the rest of those is easy, because I can just have one "status" field which is essentially an enum, but AFAIK it's impossible to set the status to "expired" at the exact moment the expiry time passes. How do people typically handle this?
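    One common pattern, sketched here with assumed table and column names (posts, status, expires_at) and MySQL syntax, is to store only the explicit statuses plus the expiry timestamp and derive "active" at query time, so nothing ever has to flip a row to "expired":

      SELECT *
      FROM posts
      WHERE status = 'open'         -- i.e. not completed, closed, or canceled
        AND expires_at > NOW();     -- not yet expired

    A view (CREATE VIEW active_posts AS ...) can wrap the condition so application code still checks a single thing.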

    Read the article

  • Strange data swapping error occurs when I attempt to update rows in my table from another table in my database

    - by Wesley
    So I have a table of data that is 10,000 rows long. Several of the columns in the table simply describe information about one of the columns; meaning, only one column has the content, and the rest of the columns describe the location of that content (it's for a book). Right now, only rows 1-6,000 have their content column filled in; in rows 6,000-10,000 the content column simply says NULL. I have another table in the DB that has the content for rows 6,000-10,000, with the correct corresponding primary key, which would (seemingly) make it easy to update the 10,000-row table. I have been trying an update query such as the following:

      UPDATE table(10,000)
      SET content_column = (SELECT content
                            FROM table(6,000-10,000)
                            WHERE table(10,000).id = table(6,000-10,000).id)

    It kind of works: it pulls in the data from the second table just fine, but it replaces the existing content column with NULL. So rows 1-6,000's content column becomes NULL, and rows 6,000-10,000's content column has the correct values. Pretty strange, I thought, anyway. Does anybody have any thoughts about where I am going wrong? If you could show me a better SQL query, I would appreciate it! Thanks
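    The likely cause is that the correlated subquery returns NULL for rows with no match in the second table, and the SET writes that NULL over the existing content. Two common fixes, sketched with placeholder names big_table and new_content standing in for the two tables (and assuming MySQL for the join form):

      -- Only touch rows that actually have a match:
      UPDATE big_table b
      SET b.content_column = (SELECT n.content FROM new_content n WHERE n.id = b.id)
      WHERE EXISTS (SELECT 1 FROM new_content n WHERE n.id = b.id);

      -- Or update through a join:
      UPDATE big_table b
      JOIN new_content n ON n.id = b.id
      SET b.content_column = n.content;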

    Read the article

  • Can't create a MySQL query that generates 4 rows for each row in the table it references.

    - by UkraineTrain
    I need to create a MySQL query that generates 4 rows for each row in the table it references. I need some of the information in those rows to repeat and some to be different. In the table, each row stands for one day. I need to break the day up into 6-hour increments, hence the four rows for each entry. I need to create one column which, for each day, will have the values '12AM', '6AM', '12PM', and '6PM', and another column with the corresponding numeric values calculated for those entries. Thanks a lot in advance; I will really appreciate any help on this.
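    One way to do it, sketched with made-up names (days for the source table, day_date and amount for its columns, and a placeholder per-slot calculation), is to cross join every day row against a fixed four-row list of slots:

      SELECT d.day_date,
             s.slot_label,
             d.amount / 4 AS slot_value    -- replace with the real per-slot calculation
      FROM days d
      CROSS JOIN (
        SELECT '12AM' AS slot_label UNION ALL
        SELECT '6AM'  UNION ALL
        SELECT '12PM' UNION ALL
        SELECT '6PM'
      ) AS s
      ORDER BY d.day_date, FIELD(s.slot_label, '12AM', '6AM', '12PM', '6PM');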

    Read the article
