Search Results

Search found 20859 results on 835 pages for 'little bobby tables'.

Page 12/835 | < Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >

  • Approach for altering Primary Key from GUID to BigInt in SQL Server related tables

    - by Tom
    I have two tables with 10-20 million rows that have GUID primary keys, and at least 12 tables related via foreign key. The base tables have 10-20 indexes each. We are moving from GUID to BigInt primary keys, and I'm wondering if anyone has any suggestions on an approach. Right now this is the approach I'm pondering:
    1. Drop all indexes and foreign keys on all the tables involved.
    2. Add a 'NewPrimaryKey' column to each table.
    3. Make the new key an identity column on the two base tables.
    4. Script the data change: "UPDATE table x SET NewPrimaryKey = y WHERE OldPrimaryKey = z".
    5. Rename the original primary key column to 'OldPrimaryKey'.
    6. Rename the 'NewPrimaryKey' column to 'PrimaryKey'.
    7. Script back all the indexes and foreign keys.
    Does this seem like a good approach? Does anyone know of a tool or script that would help with this? Edited per additional information: see this blog post that addresses an approach when the GUID is the primary key: http://www.sqlmag.com/blogs/sql-server-questions-answered/sql-server-questions-answered/tabid/1977/entryid/12749/Default.aspx
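
    For reference, a rough Python sketch of how the per-child-table statements (add the BigInt key, populate it by joining on the old GUID key, then swap the column names) might be generated. It only builds the T-SQL text, it does not execute anything, and every table and column name below (Parent, ParentId, the list of child tables) is a placeholder rather than the poster's real schema:

        # Sketch: generate migration statements for each table related by foreign key.
        # All names below are placeholders, not the actual schema.
        related_tables = ["OrderItem", "Invoice", "AuditLog"]

        statements = []
        for child in related_tables:
            statements.append(f"ALTER TABLE {child} ADD NewParentId BIGINT NULL;")
            statements.append(
                f"UPDATE c SET c.NewParentId = p.NewPrimaryKey "
                f"FROM {child} AS c JOIN Parent AS p ON c.ParentId = p.OldPrimaryKey;"
            )
            statements.append(f"EXEC sp_rename '{child}.ParentId', 'OldParentId', 'COLUMN';")
            statements.append(f"EXEC sp_rename '{child}.NewParentId', 'ParentId', 'COLUMN';")

        print("\n".join(statements))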

    Read the article

  • A little help with MVC

    - by s0mmer
    Hi everyone, I'm working on a Cocoa app for syncing data between two folders. It has profiles (so you can have multiple setups), it can analyze the data, and it can sync the data. I'm a little confused: first of all, I can't really see where to have a model. And how many controllers would you suggest: one WindowController, or AnalyzeController, SyncController, etc.? It's been quite a while since I have worked with MVC. I've read some articles, but I'm missing concrete examples of how to divide it up. Best regards.

    Read the article

  • Creating tables with pylons and SQLAlchemy

    - by Sid
    I'm using SQLAlchemy, and I can create the tables that I have defined in /model/__init__.py, but I have defined my classes, tables and their mappings in other files found in the /model directory. For example, I have a profile class and a profile table which are defined and mapped in /model/profile.py. To create the tables I run:
        paster setup-app development.ini
    My problem is that the tables I have defined in /model/__init__.py are created properly, but the table definitions found in /model/profile.py are not created. How can I execute the table definitions found in /model/profile.py so that all my tables can be created? Thanks for the help!
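
    A minimal sketch of the usual fix, assuming the common Pylons layout where a shared MetaData object lives in something like model/meta.py (the myapp package name and the columns below are placeholders): tables defined in other modules are only picked up by create_all() if they are attached to that same MetaData and the module is actually imported before setup-app runs.

        # model/profile.py (sketch) - attach the table to the shared MetaData
        from sqlalchemy import Table, Column, Integer, String
        from myapp.model.meta import metadata  # the same MetaData used in __init__.py

        profile_table = Table(
            "profile", metadata,
            Column("id", Integer, primary_key=True),
            Column("name", String(100)),
        )

        # model/__init__.py (sketch) - importing the module runs the Table()
        # definitions above, so metadata.create_all(bind=engine) in setup-app
        # will create this table as well.
        from myapp.model import profile  # noqa: F401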

    Read the article

  • Bit/Byte addressing - Little/Big-endian

    - by code8230
    Consider the 16-byte data packet below, which is sent through the network in network byte order, i.e. big endian:
        Byte num: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14 15
        Value:    34 67 89 45 90 AB FF 23 65 37 56 C6 56 B7 00 00
    Let's say 8945 is a 16-bit value; all the others are 8-bit data bytes. On my system, which is little endian, how would the data be received and stored? Let's say we are configured to receive 8 bytes at a time. RxBuff is the Rx buffer where data will be received, and Buff is the storage buffer where data will be stored. Please point out which case is correct for the data stored after reading 8 bytes at a time:
        1) Buff[] = {0x34, 0x67, 0x45, 0x89, 0x90, 0xAB, ..., 0x00};
        2) Buff[] = {0x00, 0x00, ..., 0x67, 0x89, 0x45, 0x34};
    Would the whole 16 bytes of data be reversed, or only the 2-byte value contained in this packet?
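
    A small illustration of the point, shown here with Python's struct module rather than C: the bytes land in the receive buffer in wire order regardless of host endianness, and byte order only matters when the two-byte field is reinterpreted as a 16-bit integer.

        import struct

        # The packet exactly as it arrives on the wire (network byte order).
        packet = bytes.fromhex("3467894590ABFF23653756C656B70000")

        # Copying it into a buffer does not reorder anything: the buffer holds
        # 34 67 89 45 ... 00 00, the same order as on the wire.
        buff = bytearray(packet)
        print(buff.hex())

        # Endianness only matters when bytes 2 and 3 are interpreted as a
        # 16-bit integer:
        as_big_endian    = struct.unpack(">H", packet[2:4])[0]  # 0x8945 (network order)
        as_little_endian = struct.unpack("<H", packet[2:4])[0]  # 0x4589 (byte-swapped)
        print(hex(as_big_endian), hex(as_little_endian))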

    Read the article

  • Entity Framework Create Database & Tables At Runtime

    - by dhsto
    I created some tables in an .edmx file and have been generating the database by selecting "Generate Database From Model" and manually executing an .edmx.sql file on the database to build the tables. Now, however, I am creating a setup dialog that allows the user to connect the program up to their own database. I thought running context.CreateDatabase would be good enough to create the database, along with the tables, but the tables are not created. What is the preferred method for creating the database and tables when the user specifies their own server and database to use, when originally starting with a model?

    Read the article

  • Storing SQL Tables for use in Visual Studio

    - by Raven Dreamer
    Greetings. I'm trying to create a Windows Forms application that manipulates data from several tables stored on a SQL server.
    1) What's the best way to store the data locally while the application is running? I had a previous program that only modified one table, and that was set up to use a DataGridView. However, as I don't necessarily want to view all the tables, I am looking for another way to store the data retrieved by the SELECT * FROM ... query.
    2) Is it better to load the tables, make changes within the C# application, and then update the modified tables at the end, or simply perform all operations on the database remotely (retrieving the tables each time they are needed)?
    Thank you.

    Read the article

  • Sorting the columns of an HTML table using JQuery

    - by nikolaosk
    In this post I will show you how easy it is to sort the columns of an HTML table. I will use an external library called Tablesorter, which makes life so much easier for developers. There are other posts in my blog regarding jQuery; you can find them all here. You can find another post regarding HTML tables and jQuery here. We will demonstrate this with a step by step example. I will use Visual Studio 2012 Ultimate. You can also use Visual Studio 2012 Express Edition, or the VS 2010 editions.
    1) Launch Visual Studio. Create an ASP.Net Empty Web application. Choose an appropriate name for your application.
    2) Add a web form, a default.aspx page, to the application.
    3) Add a table from the HTML controls tab (from the Toolbox) to the default.aspx page.
    4) Now we need to download the jQuery library. Please visit http://jquery.com/ and download the minified version. Then we need to download the Tablesorter jQuery plugin. Please download it here.
    5) We need to reference the jQuery library and the external jQuery plugin. In the head section, add the following lines:
        <script src="jquery-1_8_2_min.js" type="text/javascript"></script>
        <script src="jquery.tablesorter.js" type="text/javascript"></script>
    6) We need to type the HTML markup, the HTML table and its columns:
        <body>
            <form id="form1" runat="server">
            <div>
                <h1>Liverpool Legends</h1>
                <table style="width: 50%;" border="1" cellpadding="10" cellspacing="10" class="liverpool">
                    <thead>
                        <tr><th>Defenders</th><th>MidFielders</th><th>Strikers</th></tr>
                    </thead>
                    <tbody>
                        <tr>
                            <td>Alan Hansen</td>
                            <td>Graeme Souness</td>
                            <td>Ian Rush</td>
                        </tr>
                        <tr>
                            <td>Alan Kennedy</td>
                            <td>Steven Gerrard</td>
                            <td>Michael Owen</td>
                        </tr>
                        <tr>
                            <td>Jamie Garragher</td>
                            <td>Kenny Dalglish</td>
                            <td>Robbie Fowler</td>
                        </tr>
                        <tr>
                            <td>Rob Jones</td>
                            <td>Xabi Alonso</td>
                            <td>Dirk Kuyt</td>
                        </tr>
                    </tbody>
                </table>
            </div>
            </form>
        </body>
    7) Inside the head section we also write the simple jQuery code:
        <script type="text/javascript">
            $(document).ready(function() {
                $('.liverpool').tablesorter();
            });
        </script>
    8) Run your application. This is how the HTML table looks before the table is sorted on the basis of the selected column.
    9) Now I will click on the Midfielders header. Have a look at the picture below.
    Tablesorter is an excellent jQuery plugin that makes sorting HTML tables a piece of cake. Hope it helps!!!

    Read the article

  • Read multiple tables from dataset in Powershell

    - by Lucas
    I am using a function that collects data from a SQL server:
        function Invoke-SQLCommand {
            param(
                [string] $dataSource = "myserver",
                [string] $dbName = "mydatabase",
                [string] $sqlCommand = $(throw "Please specify a query.")
            )
            $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
            $SqlConnection.ConnectionString = "Server=$dataSource;Database=$dbName;Integrated Security=True"
            $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
            $SqlCmd.CommandText = $sqlCommand
            $SqlCmd.Connection = $SqlConnection
            $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
            $SqlAdapter.SelectCommand = $SqlCmd
            $DataSet = New-Object System.Data.DataSet
            $SqlAdapter.Fill($DataSet)
            $SqlConnection.Close()
            $DataSet.Tables[0]
        }
    It works great but returns only one table. I am passing several SELECT statements, so the dataset contains multiple tables. I replaced $DataSet.Tables[0] with
        for ($i=0; $i -lt $DataSet.Tables.Count; $i++) {
            $DataSet.Tables[$i]
        }
    but the console only shows the content of the first table and blank lines for each record of what should be the second table. The only way to see the result is to change the code to
        $DataSet.Tables[$i] | Out-String
    but I do not want strings, I want to have table objects to work with. When I assign what is returned by Invoke-SQLCommand to a variable, I can see that I have an array of DataRow objects, but only from the first table. What happened to the second table? Any help would be greatly appreciated. Thanks

    Read the article

  • Where are tables in Mnesia located?

    - by Sanoj
    I'm trying to compare Mnesia with more traditional databases. As I understand it, tables in Mnesia can be stored as:
    ram_copies - tables are stored in RAM only, so there is no durability as in ACID.
    disc_copies - tables are stored on disc with a copy in RAM, so a table cannot be bigger than the available memory?
    disc_only_copies - tables are stored on disc only, so there is no caching in memory and worse performance? And the size of the table is limited to the size of dets, or the table has to be fragmented.
    So if I want the performance of reads from RAM and the durability of writes to disc, then the size of the tables is very limited compared to a traditional RDBMS like MySQL or PostgreSQL. I know that Mnesia isn't meant to replace traditional RDBMSs, but can it be used as a big RDBMS, or do I have to look for another database? The server I will use is a VPS with a limited amount of memory, around 512MB, but I want good database performance. Are disc_copies and the other types of tables in Mnesia as limited as I have understood?

    Read the article

  • Setting up DrJava to work through Friedman / Felleisen "A Little Java"

    - by JDelage
    All, I'm going through the Friedman & Felleisen book "A Little Java, A Few Patterns". I'm trying to type the examples into DrJava, but I'm getting some errors. I'm a beginner, so I might be making rookie mistakes. Here is what I have set up:
        public class ALittleJava {
            // ABSTRACT CLASS POINT
            abstract class Point {
                abstract int distanceToO();
            }
            class CartesianPt extends Point {
                int x;
                int y;
                int distanceToO() {
                    return ((int) Math.sqrt(x*x + y*y));
                }
                CartesianPt(int _x, int _y) {
                    x = _x;
                    y = _y;
                }
            }
            class ManhattanPt extends Point {
                int x;
                int y;
                int distanceToO() {
                    return (x + y);
                }
                ManhattanPt(int _x, int _y) {
                    x = _x;
                    y = _y;
                }
            }
        }
    And on the main's side:
        public class Main {
            public static void main(String[] args) {
                Point y = new ManhattanPt(2, 8);
                System.out.println(y.distanceToO());
            }
        }
    The compiler cannot find the symbols Point and ManhattanPt in the program. If I precede each by ALittleJava., I get another error in the main, i.e., "an enclosing instance that contains ALittleJava.ManhattanPt is required". I've tried to find resources on the net, but the book must have a pretty confidential following and I couldn't find much. Thank you all. JDelage

    Read the article

  • sqlite3 JOIN, GROUP_CONCAT using distinct with custom separator

    - by aiwilliams
    Given a table of "events" where each event may be associated with zero or more "speakers" and zero or more "terms", those records associated with the events through join tables, I need to produce a table of all events with a column in each row which represents the list of "speaker_names" and "term_names" associated with each event. However, when I run my query, I have duplication in the speaker_names and term_names values, since the join tables produce a row per association for each of the speakers and terms of the events: 1|Soccer|Bobby|Ball 2|Baseball|Bobby - Bobby - Bobby|Ball - Bat - Helmets 3|Football|Bobby - Jane - Bobby - Jane|Ball - Ball - Helmets - Helmets The group_concat aggregate function has the ability to use 'distinct', which removes the duplication, though sadly it does not support that alongside the custom separator, which I really need. I am left with these results: 1|Soccer|Bobby|Ball 2|Baseball|Bobby|Ball,Bat,Helmets 3|Football|Bobby,Jane|Ball,Helmets My question is this: Is there a way I can form the query or change the data structures in order to get my desired results? Keep in mind this is a sqlite3 query I need, and I cannot add custom C aggregate functions, as this is for an Android deployment. I have created a gist which makes it easy for you to test a possible solution: https://gist.github.com/4072840
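
    One workaround, sketched below with Python's sqlite3 module and a guessed-at schema (events, speakers and one join table; the table and column names are assumptions, not the schema from the linked gist): de-duplicate the event/speaker pairs in a subquery first, so group_concat with a custom separator never sees repeated rows.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE events (id INTEGER PRIMARY KEY, name TEXT);
            CREATE TABLE speakers (id INTEGER PRIMARY KEY, name TEXT);
            CREATE TABLE events_speakers (event_id INTEGER, speaker_id INTEGER);

            INSERT INTO events VALUES (1, 'Soccer'), (2, 'Baseball');
            INSERT INTO speakers VALUES (1, 'Bobby'), (2, 'Jane');
            -- The duplicate association rows below stand in for the row
            -- multiplication you get when joining speakers and terms at once.
            INSERT INTO events_speakers VALUES (1, 1), (2, 1), (2, 1), (2, 2);
        """)

        # DISTINCT in the subquery removes the duplicates before aggregation,
        # so group_concat is free to use its custom separator.
        rows = conn.execute("""
            SELECT e.id, e.name, group_concat(s.speaker_name, ' - ') AS speaker_names
            FROM events e
            LEFT JOIN (SELECT DISTINCT es.event_id, sp.name AS speaker_name
                       FROM events_speakers es
                       JOIN speakers sp ON sp.id = es.speaker_id) s
                   ON s.event_id = e.id
            GROUP BY e.id, e.name
        """).fetchall()

        for row in rows:
            print(row)   # e.g. (2, 'Baseball', 'Bobby - Jane')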

    Read the article

  • Need little assistance

    - by Umaid
    I am iterating over the current days, so I need a little assistance:
        for (int I=-1; I<30; I++) {
            for (int J=0; J=30; J++) {
                for (int K=1; K=30; K++) {
                    SELECT rowid, Month, Day, Advice from MainCategory
                    where Month= 'May ' and Day in (
                        (cast(strftime('%d',date('now','I day')) as Integer)),
                        (cast(strftime('%d',date('now','J day')) as Integer)),
                        (cast(strftime('%d',date('now','K day')) as Integer)));
                }
            }
        }
    What if I want to go in reverse order also?
        for (int I=-1; I<30; I--) {
            for (int J=0; J=30; J--) {
                for (int K=1; K=30; K--) {
                    SELECT rowid, Month, Day, Advice from MainCategory
                    where Month= 'May ' and Day in (
                        (cast(strftime('%d',date('now','I day')) as Integer)),
                        (cast(strftime('%d',date('now','J day')) as Integer)),
                        (cast(strftime('%d',date('now','K day')) as Integer)));
                }
            }
        }
    On every "previous" click, I want to fetch 3 records, so do I need to iterate up to 3, or over all 30 records in the month from which I want to fetch?
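
    For what it's worth, a small sketch in Python with the sqlite3 module (against the MainCategory table from the question) of passing each day offset as a bound parameter instead of hard-coding it inside the 'I day' string; the database file name and the exact offsets are assumptions:

        import sqlite3

        conn = sqlite3.connect("advice.db")  # hypothetical database file

        def fetch_for_offsets(offsets):
            """Fetch rows whose Day matches today shifted by each offset (in days)."""
            placeholders = ", ".join(
                "cast(strftime('%d', date('now', ?)) as integer)" for _ in offsets
            )
            sql = ("SELECT rowid, Month, Day, Advice FROM MainCategory "
                   "WHERE Month = 'May' AND Day IN (" + placeholders + ")")
            params = ["{:+d} day".format(n) for n in offsets]  # e.g. '+1 day', '-1 day'
            return conn.execute(sql, params).fetchall()

        print(fetch_for_offsets([-1, 0, 1]))    # yesterday, today, tomorrow
        print(fetch_for_offsets([-4, -3, -2]))  # a "previous" page of three days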

    Read the article

  • A little confused about MVC and where to put a database query

    - by jax
    OK, so my Joomla app is in MVC format. I am still a little confused about where to put certain operations, in the Controller or in the Model. The function below is in the controller; it gets called when &task=remove. Should the database stuff be in the Model? It does not seem to fit there, because I have two models, editapp (display a single application) and allapps (display all the applications), so which one would I put the delete operation in?
        /**
         * Delete an application
         */
        function remove()
        {
            global $mainframe;
            $cid = JRequest::getVar( 'cid', array(), '', 'array' );
            $db =& JFactory::getDBO();
            // if there are items to delete
            if (count($cid)) {
                $cids = implode( ',', $cid );
                $query = "DELETE FROM #__myapp_apps WHERE id IN ( $cids )";
                $db->setQuery( $query );
                if (!$db->query()) {
                    echo "<script> alert('".$db->getErrorMsg()."');window.history.go(-1); </script>\n";
                }
            }
            $mainframe->redirect( 'index.php?option=' . $option . '&c=apps');
        }
    I am also confused about how the flow works. For example, there is a display() function in the controller that gets called by default. If I pass a task, does the display() function still run, or does it go directly to the function named by $task?

    Read the article

  • Font for tabs looks a little too big

    - by cf_PhillipSenn
    I'm using the default for jQueryUI, but it looks like the font is a little big. I know that one solution would be "WELL! JUST MAKE IT SMALLER!", but I'm just wondering if I've messed something up or don't have a value set correctly before I charge in and start changing things.
        <!DOCTYPE HTML>
        <html>
        <head>
            <script src="http://www.google.com/jsapi"></script>
            <script type="text/javascript">
                google.load("jquery", "1");
            </script>
            <link rel="stylesheet" href="http://ajax.googleapis.com/ajax/libs/jqueryui/1/themes/smoothness/jquery-ui.css" type="text/css" media="all" />
            <script type="text/javascript">
                google.load("jqueryui", "1");
                function OnLoadCallbackUI(){
                    $('#tabs').tabs();
                }
                google.setOnLoadCallback(OnLoadCallbackUI);
            </script>
        </head>
        <body>
            <div id="tabs">
                <ul>
                    <li><a href="#tabs-1">tab1</a></li>
                    <li><a href="#tabs-2">tab2</a></li>
                </ul>
                <div id="tabs-1">
                    tabs-1
                </div>
                <div id="tabs-2">
                    tabs-2
                </div>
            </div>
        </body>
        </html>

    Read the article

  • MySQL - “FLUSH TABLES WITH READ LOCK” started automatically

    - by mingyeow
    I would like to understand how this happened. I was running a query that would take a long time, but should not lock up any table. However, my databases were practically down; it seems they were being locked up by "FLUSH TABLES WITH READ LOCK":
        03:21:31  select type_id, count(*) from guid_target_infos group by type_id
        02:38:11  select type_id, count(*) from guid_infos group by type_id
        02:24:29  FLUSH TABLES WITH READ LOCK
    But I did not start this command. Can someone tell me why it was started automatically?

    Read the article

  • PostgreSQL lots of Tables

    - by strife911
    Hi, we are at a point where we have more than a thousand tables in our PostgreSQL database server. I remember reading that there was a way to speed up the database once it reached more than a thousand tables, but I cannot seem to find any mention of this on the Web with Google. Any help would be nice. Thanks

    Read the article

  • mysqldump triggering repair of MySQL tables

    - by Rhodri
    I have an automated backup of a 6 Gigabyte MySQL database running every two hours. I also have a script which checks every minute for the need to repair MySQL tables. Increasingly I'm getting tables that have to be repaired during the backup process, with this message returned:
        Auto-increment value: 0 is smaller than max used value: xx
    Is this being caused by corruption? Are the two scripts conflicting? Any ideas?

    Read the article

  • Backing up MySQL DB with mixture of InnoDB and MyISAM tables

    - by madphp
    I have a large database (almost 1GB) and it has a mixture of InnoDB and MyISAM tables. Does anyone have any general tips when backing it up, or more specifically, which options I should pass to mysqldump? I see that I should lock MyISAM tables and use a single transaction for InnoDB, but what if I have both? Also, what actually happens when I lock an entire (very big) table on a production database?

    Read the article

  • Using IP Tables to deny packet patterns?

    - by Chris
    I'm not experienced with iptables, but it's something I'll be looking into if this is plausible. I'm looking to set up a system to inspect packets and look for a pattern similar to KoreK's chopchop attack. Is there a way to set up iptables to defend against this attack? Thanks

    Read the article

  • Resources about Excel tables and structured references?

    - by jtolle
    I'm new to post-2000 Excel, and I'd like to learn more about how to use tables (formerly lists) and structured references. Can anyone point me to some good treatments of this topic that go beyond the built-in help? (For example, there are numerous full books about just pivot tables. Something like that for using tables would be ideal.)

    Read the article

  • Was a Big Fish in a Little Pond, Am Now a Little Fish in a Big Pond. How Do I Grow? [closed]

    - by Ziv
    I finished high school, where I was in the top three of my class. I studied a little, and there too I was pretty much a big fish in a pond only somewhat bigger than high school. Now I've started my first job at a very big company. There are some incredibly talented programmers and researchers here (mostly in departments not related to mine), and for the first time I really feel incredibly average - and I do not want to be average. I read technical books all the time and I try to code in my personal time, but I don't feel like that's enough. What can I do to become a leading programmer again in this big company? Is there anything specific I can do to make myself known here? This is a very big company, so in order to advance you must be very good and shine in your field.

    Read the article

  • Microsoft SyncFramework - Sync different tables into one

    - by evnu
    Hello, we are trying to get the Microsoft SyncFramework running in our application to synchronize an Oracle database with a mobile device.
    Problem: The queries that we need to gather the data on the Oracle database take a lot of time (and we haven't found a way to speed them up yet), so we try to split them up into as many portions as possible. One big part of the problem is that we need different pieces of information out of one big table, which bloats the query if they are combined. Unfortunately, the SyncFramework allows only one TableAdapter per SyncTable. This is a problem for our application: if we were able to use more than one TableAdapter per SyncTable, we could easily spread the queries in a more efficient way. Using one query per table which combines all the needed data takes way too much time.
    Ideas: I thought of creating different TableAdapters for each of the required queries and then merging the resulting datasets afterwards (preferably on the server). This seems to work, but is a rather awkward solution. Does anyone know a better solution? Or do you have some ideas that could help? Thanks in advance, evnu
    EDIT: So, I implemented the merge solution. If you are interested, take a look at the following code. I'll give more details if there are questions.
        <WebMethod()> _
        Public Function GetChanges(ByVal groupMetadata As SyncGroupMetadata, ByVal syncSession As SyncSession) As SyncContext
            Dim stream As MemoryStream
            Dim format As BinaryFormatter = New BinaryFormatter
            Dim anchors As Dictionary(Of String, Byte())
            ' keep track of the tables that will be updated
            Dim addTables As Dictionary(Of String, List(Of SyncTableMetadata)) = New Dictionary(Of String, List(Of SyncTableMetadata))
            ' list of all present anchors
            Dim allAnchors As Dictionary(Of String, Byte()) = New Dictionary(Of String, Byte())

            ' fill allAnchors - deserialize all given anchors
            For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata
                If Table.LastReceivedAnchor Is Nothing Or Table.LastReceivedAnchor.IsNull Then Continue For
                stream = New MemoryStream(Table.LastReceivedAnchor.Anchor)
                anchors = format.Deserialize(stream)
                For Each item As KeyValuePair(Of String, Byte()) In anchors
                    allAnchors.Add(item.Key, item.Value)
                Next
                stream.Dispose()
            Next

            For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata
                If allAnchors.ContainsKey(Table.TableName) Then
                    Table.LastReceivedAnchor.Anchor = allAnchors(Table.TableName)
                End If
                Dim addSyncTables As List(Of SyncTableMetadata)
                If syncSession.SyncParameters.Contains(Table.TableName) Then
                    Dim tableNames() As String = syncSession.SyncParameters(Table.TableName).Value.ToString.Split(":")
                    addSyncTables = New List(Of SyncTableMetadata)
                    For Each tableName As String In tableNames
                        Dim newSynctable As SyncTableMetadata = New SyncTableMetadata
                        newSynctable.TableName = tableName
                        If allAnchors.ContainsKey(tableName) Then
                            Dim anker As SyncAnchor = New SyncAnchor(allAnchors(tableName))
                            newSynctable.LastReceivedAnchor = anker
                        Else
                            newSynctable.LastReceivedAnchor = Nothing
                        End If
                        newSynctable.SyncDirection = Table.SyncDirection
                        addSyncTables.Add(newSynctable)
                    Next
                    addTables.Add(Table.TableName, addSyncTables)
                End If
            Next

            ' add the newly created synctables
            For Each item As KeyValuePair(Of String, List(Of SyncTableMetadata)) In addTables
                For Each Table As SyncTableMetadata In item.Value
                    groupMetadata.TablesMetadata.Add(Table)
                Next
            Next

            ' fire queries
            Dim context As SyncContext = servSyncProvider.GetChanges(groupMetadata, syncSession)

            ' merge resulting datasets
            For Each item As KeyValuePair(Of String, List(Of SyncTableMetadata)) In addTables
                For Each Table As SyncTableMetadata In item.Value
                    If context.DataSet.Tables.Contains(Table.TableName) Then
                        If Not context.DataSet.Tables.Contains(item.Key) Then
                            Dim tmp As DataTable = context.DataSet.Tables(Table.TableName).Copy
                            tmp.TableName = item.Key
                            context.DataSet.Tables.Add(tmp)
                        Else
                            context.DataSet.Tables(item.Key).Merge(context.DataSet.Tables(Table.TableName))
                            context.DataSet.Tables.Remove(Table.TableName)
                        End If
                    End If
                Next
            Next

            ' create new anchors
            Dim allAnchorsDict As Dictionary(Of String, Byte()) = New Dictionary(Of String, Byte())
            For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata
                allAnchorsDict.Add(Table.TableName, context.NewAnchor.Anchor)
            Next
            stream = New MemoryStream
            format.Serialize(stream, allAnchorsDict)
            context.NewAnchor.Anchor = stream.ToArray
            stream.Dispose()

            Return context
        End Function

    Read the article

< Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >