Search Results

Search found 10101 results on 405 pages for 'temporary tables'.

Page 337/405 | < Previous Page | 333 334 335 336 337 338 339 340 341 342 343 344  | Next Page >

  • ASP.NET and VB.NET OleDbConnection Problem

    - by Matt
    I'm working on an ASP.NET website where I am using an asp:Repeater with paging done through a VB.NET code-behind file. I'm having trouble with the database connection, though. As far as I can tell, the paging is working, but I can't get the data to be certain. The database is a Microsoft Access database. The function that should be accessing the database is:

        Dim pagedData As New PagedDataSource

        Sub Page_Load(ByVal obj As Object, ByVal e As EventArgs)
            doPaging()
        End Sub

        Function getTheData() As DataTable
            Dim DS As New DataSet()
            Dim strConnect As New OleDbConnection("Provider = Microsoft.Jet.OLEDB.4.0;Data Source=App_Data/ArtDatabase.mdb")
            Dim objOleDBAdapter As New OleDbDataAdapter("SELECT ArtID, FileLocation, Title, UserName, ArtDate FROM Art ORDER BY Art.ArtDate DESC", strConnect)
            objOleDBAdapter.Fill(DS, "Art")
            Return DS.Tables("Art").Copy
        End Function

        Sub doPaging()
            pagedData.DataSource = getTheData().DefaultView
            pagedData.AllowPaging = True
            pagedData.PageSize = 2
            Try
                pagedData.CurrentPageIndex = Int32.Parse(Request.QueryString("Page")).ToString()
            Catch ex As Exception
                pagedData.CurrentPageIndex = 0
            End Try
            btnPrev.Visible = (Not pagedData.IsFirstPage)
            btnNext.Visible = (Not pagedData.IsLastPage)
            pageNumber.Text = (pagedData.CurrentPageIndex + 1) & " of " & pagedData.PageCount
            ArtRepeater.DataSource = pagedData
            ArtRepeater.DataBind()
        End Sub

    The ASP.NET is:

        <asp:Repeater ID="ArtRepeater" runat="server">
            <HeaderTemplate>
                <h2>Items in Selected Category:</h2>
            </HeaderTemplate>
            <ItemTemplate>
                <li>
                    <asp:HyperLink runat="server" ID="HyperLink" NavigateUrl='<%# Eval("ArtID", "ArtPiece.aspx?ArtID={0}") %>'>
                        <img src="<%# Eval("FileLocation") %>" alt="<%# DataBinder.Eval(Container.DataItem, "Title") %>"/>
                        <br />
                        <%# DataBinder.Eval(Container.DataItem, "Title") %>
                    </asp:HyperLink>
                </li>
            </ItemTemplate>
        </asp:Repeater>

    Read the article

  • "QFontEngine(Win) GetTextMetrics failed ()" error on 64-bit Windows

    - by David Murdoch
    I'll add 500 of my own rep as a bounty when SO lets me. I'm using wkhtmltopdf to convert HTML web pages to PDFs. This works perfectly on my 32-bit dev server [unfortunately, I can't ship my machine :-p ]. However, when I deploy to the web application's 64-bit server the following errors are displayed:

        C:\>wkhtmltopdf http://www.google.com google.pdf
        Loading pages (1/5)
        QFontEngine::loadEngine: GetTextMetrics failed ()      ] 10%
        QFontEngineWin: GetTextMetrics failed ()
        QFontEngineWin: GetTextMetrics failed ()
        QFontEngine::loadEngine: GetTextMetrics failed ()
        QFontEngineWin: GetTextMetrics failed ()
        QFontEngineWin: GetTextMetrics failed ()
        QFontEngineWin: GetTextMetrics failed ()
        QFontEngine::loadEngine: GetTextMetrics failed ()      ] 36%
        QFontEngineWin: GetTextMetrics failed ()
        QFontEngineWin: GetTextMetrics failed ()
        // ...etc....

    and the PDF is created and saved... just WITHOUT text. All form fields, images, borders, tables, divs, spans, ps, etc. are rendered accurately... just void of any text at all. Server information:

        Windows edition: Windows Server Standard Service Pack 2
        Processor: Intel Xeon E5410 @ 2.33GHz 2.33 GHz
        Memory: 8.00 GB
        System type: 64-bit Operating System

    Can anyone give me a clue as to what is happening and how I can fix this? Also, I wasn't sure what to tag/title this question with... so if you can think of better tags/title comment them or edit the question. :-)

    Read the article

  • Merge replication server side foreign key violation from unpublished table

    - by Reiste
    We are using SQL Server 2005 Merge Replication with SQL CE 3.5 clients. We are using partitions with filtering for the separate subscriptions, and NHibernate for the ORM mapping. There is automatic ID range management from SQL Server for the subscriptions. We have a table, Item, and a table with a foreign key to Item - ItemHistory. Both of these are replicated down, filtered according to the subscription. Item has a column called UserId, and is filtered per subscription with this filter:

        WHERE UserId IN (SELECT... [complicated subselect]...)

    ItemHistory hangs off Item in the publication filter articles. On the server, we have a table ItemHistoryExport, which has a foreign key to ItemHistory. ItemHistoryExport is not published. Entries in the Item and ItemHistory tables are never deleted, on the server or the client. However, the "ownership" of items (and hence their ItemHistories) MAY change, which causes them to be moved from one client subscription/partition to another from time to time. When we sync, we occasionally get the following error:

        A row delete at '48269404 - 4108383dbb11' could not be propagated to 'MyServer\MyInstance.MyDatabase'.
        This failure can be caused by a constraint violation. The DELETE statement conflicted with the
        REFERENCE constraint "FK_ItemHistoryExport_ItemHistory". The conflict occurred in database
        "MyDatabase", table "dbo.ItemHistoryExport", column 'ItemHistoryId'.

    Can anyone help us understand why this happens? There shouldn't ever be a delete happening on the server side.

    Read the article

  • Linq: the linked objects are null, why?

    - by user46503
    Hello, I have several linked tables (entities). I'm trying to get the entities using the following LINQ query:

        ObjectQuery<Location> locations = context.Location;
        ObjectQuery<ProductPrice> productPrice = context.ProductPrice;
        ObjectQuery<Product> products = context.Product;

        IQueryable<ProductPrice> res1 = from pp in productPrice
                                        join loc in locations on pp.Location equals loc
                                        join prod in products on pp.Product equals prod
                                        where prod.Title.ToLower().IndexOf(Word.ToLower()) > -1
                                        select pp;

    This query returns 2 records, ProductPrice objects that have the linked objects Location and Product, but those objects are null and I cannot understand why. If I try to fill them in the LINQ query as below:

        res = from pp in productPrice
              join loc in locations on pp.Location equals loc
              join prod in products on pp.Product equals prod
              where prod.Title.ToLower().IndexOf(Word.ToLower()) > -1
              select new ProductPrice
              {
                  ProductPriceId = pp.ProductPriceId,
                  Product = prod
              };

    I get the exception "The entity or complex type 'PBExplorerData.ProductPrice' cannot be constructed in a LINQ to Entities query". Could someone please explain to me what happens and what I need to do? Thanks

    Read the article

  • Linq to sql DataContext cannot set load options after results been returned

    - by David Liddle
    I have two tables A and B with a one-to-many relationship respectively. On some pages I would like to get a list of A objects only. On other pages I would like to load A with its B objects attached. This can be handled by setting the load options:

        DataLoadOptions options = new DataLoadOptions();
        options.LoadWith<A>(a => a.B);
        dataContext.LoadOptions = options;

    The trouble occurs when I first view all A's with load options, then go to edit a single A (without load options), and after the edit return to the previous page. I understand why the error is occurring but am not sure how best to get round this problem. I would like the DataContext to be loaded up per request. I thought I was achieving this by using StructureMap to load up my DataContext on a per-request basis. This is all part of an n-tier application where my Controllers call Services which in turn call Repositories.

        ForRequestedType<MyDataContext>()
            .CacheBy(InstanceScope.PerRequest)
            .TheDefault.Is.Object(new MyDataContext());

        ForRequestedType<IAService>()
            .TheDefault.Is.OfConcreteType<AService>();

        ForRequestedType<IARepository>()
            .TheDefault.Is.OfConcreteType<ARepository>();

    Here is a brief outline of my Repository:

        public class ARepository : IARepository
        {
            private MyDataContext db;

            public ARepository(MyDataContext context)
            {
                db = context;
            }

            public void SetLoadOptions(DataLoadOptions options)
            {
                db.LoadOptions = options;
            }

            public IQueryable<A> Get()
            {
                return from a in db.A select a;
            }
        }

    So my ServiceLayer, on View All, sets the load options and then gets all A's. On editing an A, my ServiceLayer should spin up a new DataContext and just fetch a list of A's. When SQL profiling, I can see that when I go to the Edit page it is requesting A with B objects.

    Read the article

  • SSIS Lookup with Lookup Component Vs Script Component.

    - by Nev_Rahd
    Hello, I need to load dimensions from EDW tables (which maintain historical records) of key-value-parameter type. My scenario is fine if I get records in the EDW like these:

        Key1  Key2  Code  Value  EffectiveDate        EndDate     CurrentFlag
        100   555   01    AAA    2010-01-01 11.00.00  9999-12-31  Y
        100   555   02    BBB    2010-01-01 11.00.00  9999-12-31  Y

    These need to be loaded into the DM by pivoting, as the key1 and key2 combination makes the natural key for the DM:

        SK  NK       01   02   EffectiveDate        EndDate     CurrentFlag
        1   100-555  AAA  BBB  2010-01-01 11.00.00  9999-12-31  Y

    My SSIS package does all the pivoting fine: it looks up the incoming NK in the DIM; if it is new it inserts, otherwise a further lookup on effective date determines whether the incoming row for the same natural key has any change in an attribute. If so, it updates the current record by setting its end date and inserts a new one with the new attribute value, pulling the most recent record's values for the other attributes. My problem is that if the same natural key comes twice with the same attribute in a single extract, my first lookup, which is on the natural key, lets both records pass and tries to insert both, which fails. If I take distinct records on the NK, the second one is not picked up and I need to run the package again. So my question is: how can I configure the lookup, or what alternative is there, to handle the scenario where the same NK comes twice in a single extract? I should be able to insert the first record if it does not exist in the Dim table, and the second one should then be applied as an update against the record inserted above. I'm not sure whether what I am trying to explain makes sense. I will attach a screenshot once I'm back at my desk (on Monday). Thanks
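
    One common workaround, sketched below as an assumption rather than the poster's actual package: stage the extract, number the rows per natural key, feed only the first occurrence per key through the existing insert/update flow, then replay the later occurrences in a second pass once the first row exists in the dimension. The table and column names (StagingItem, Key1, Key2, EffectiveDate) are illustrative only.

        -- Hypothetical staging table; number the rows within each natural key.
        WITH ranked AS (
            SELECT s.*,
                   ROW_NUMBER() OVER (PARTITION BY Key1, Key2
                                      ORDER BY EffectiveDate) AS occurrence
            FROM StagingItem s
        )
        SELECT *
        FROM ranked
        WHERE occurrence = 1;   -- pass 1: feed these rows to the existing data flow
        -- pass 2: re-run the same data flow with WHERE occurrence > 1,
        -- so each later row sees the row inserted by pass 1 during its lookups.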

    Read the article

  • How do i write this jpql query?

    - by Nitesh Panchal
    Hello, say I have 5 tables:

        tblBlogs:           BlogId, BlogTitle
        tblBlogPosts:       BlogPostsId, BlogId, PostTitle
        tblBlogPostComment: BlogPostCommentId, CommentText, BlogPostsId, BlogMemberId
        tblUser:            UserId, FirstName
        tblBlogMember:      BlogMemberId, UserId, BlogId

    Now I want to retrieve only those blogs and posts on which the blog member has actually commented. So in short, how do I write this plain old SQL:

        Select b.BlogTitle, bp.PostTitle, bpc.CommentText
        from tblBlogs b
        Inner join tblBlogPosts bp on b.BlogId = bp.BlogId
        Inner Join tblBlogPostComment bpc on bp.BlogPostsId = bpc.BlogPostsId
        Inner Join tblBlogMember bm On bpc.BlogMemberId = bm.BlogMemberId
        Where bm.UserId = 1;

    As you can see, everything is an inner join, so only those rows are retrieved for which the user has commented on some post of some blog. So, suppose he has joined 3 blogs whose ids are 1, 2 and 3 (the blogs which a user has joined are in tblBlogMember) but the user has only commented in blog 2 (of, say, BlogPostId = 1). Then that row will be retrieved, and blogs 1 and 3 won't, as it is an inner join. How do I write this kind of query in JPQL? In JPQL we can only write simple queries like, say:

        Select bm.blogId from tblBlogMember Where bm.UserId = objUser;

    where objUser is supplied using:

        em.find(User.class, 1);

    Thus, once we get all blogs (here blogId represents a blog object) which the user has joined, we can loop through and do all the fancy things. But I don't want to fall into this looping business and write all these things in my Java code. Instead, I want to leave that for the database engine to do. So, how do I write the above plain SQL in JPQL? And what type of object will the JPQL query return, since I am only selecting a few fields from all the tables? Which class should I typecast the result to? I think I posted my requirement correctly; if I am not clear please let me know. Thanks in advance :).

    Read the article

  • Is it possible to reuse subqueries?

    - by Gothmog
    Hello, I'm having some problems trying to perform a query. I have two tables: one with element information, and another one with records related to the elements of the first table. The idea is to get in the same row the element information plus several of the related records. The structure could be explained like this:

        table  [ id, name ]
            [1, '1']
            [2, '2']

        table2 [ id, type, value ]
            [1, 1, '2009-12-02']
            [1, 2, '2010-01-03']
            [1, 4, '2010-01-03']
            [2, 1, '2010-01-02']
            [2, 2, '2010-01-02']
            [2, 2, '2010-01-03']
            [2, 3, '2010-01-07']
            [2, 4, '2010-01-07']

    And this is what I would like to achieve:

        result [ id, name, Column1, Column2, Column3, Column4 ]
            [1, '1', '2009-12-02', '2010-01-03', , '2010-01-03']
            [2, '2', '2010-01-02', '2010-01-02', '2010-01-07', '2010-01-07']

    The following query gets the proper result, but it seems to me extremely inefficient, having to iterate table2 for each column. Would it be possible in any way to do a subquery and reuse it?

        SELECT a.id, a.name,
            (select min(value) from table2 t where t.id = subquery.id and t.type = 1 group by t.type) as Column1,
            (select min(value) from table2 t where t.id = subquery.id and t.type = 2 group by t.type) as Column2,
            (select min(value) from table2 t where t.id = subquery.id and t.type = 3 group by t.type) as Column3,
            (select min(value) from table2 t where t.id = subquery.id and t.type = 4 group by t.type) as Column4
        FROM (SELECT distinct id
              FROM table2 t
              WHERE (t.type in (1, 2, 3, 4))
                AND t.value between '2010-01-01' and '2010-01-07') as subquery
        LEFT JOIN table a ON a.id = subquery.id
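
    A sketch of one way to avoid the four correlated subqueries is conditional aggregation: a single pass over table2, pivoted with MIN(CASE ...). This is an assumption about the intent, not the poster's code, and note that it applies the date filter to the aggregated values as well, which differs slightly from the original query:

        -- One scan of table2; each CASE picks out one type and MIN() pivots it into a column.
        SELECT a.id, a.name,
               MIN(CASE WHEN t.type = 1 THEN t.value END) AS Column1,
               MIN(CASE WHEN t.type = 2 THEN t.value END) AS Column2,
               MIN(CASE WHEN t.type = 3 THEN t.value END) AS Column3,
               MIN(CASE WHEN t.type = 4 THEN t.value END) AS Column4
        FROM table2 t
        JOIN table a ON a.id = t.id
        WHERE t.type IN (1, 2, 3, 4)
          AND t.value BETWEEN '2010-01-01' AND '2010-01-07'
        GROUP BY a.id, a.name;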

    Read the article

  • How can I filter a JTable?

    - by Jonas
    I would like to filter a JTable, but I don't understand how I can do it. I have read How to Use Tables - Sorting and Filtering and I have tried the code below, but with that filter no rows at all are shown in my table. And I don't understand what column it is filtered on.

        private void myFilter() {
            RowFilter<MyModel, Object> rf = null;
            try {
                rf = RowFilter.regexFilter(filterFld.getText(), 0);
            } catch (java.util.regex.PatternSyntaxException e) {
                return;
            }
            sorter.setRowFilter(rf);
        }

    MyModel has three columns; the first two are strings and the last column is of type Integer. How can I apply the filter above, considering the text in filterFld.getText(), and only filter the rows where the text is matched on the second column? I would like to show all rows that start with the text specified by filterFld.getText(). I.e. if the text is APP then the JTable should contain the rows where the second column starts with APPLE or APPLICATION, but not the rows where the second column is CAR or ORANGE. I have also tried with this filter:

        RowFilter<MyModel, Integer> itemFilter = new RowFilter<MyModel, Integer>() {
            public boolean include(Entry<? extends MyModel, ? extends Integer> entry) {
                MyModel model = entry.getModel();
                MyItem item = model.getRecord(entry.getIdentifier());
                if (item.getSecondColumn().startsWith("APP")) {
                    return true;
                } else {
                    return false;
                }
            }
        };

    How can I write a filter that filters the JTable on the second column, specified by my text field?

    Read the article

  • Intern working for Indian NGO - Help with PHP 4, advising staff

    - by Kevin Burke
    Hello, for the past three months I've been working for an Indian NGO (http://sevamandir.org), doing some volunteer work in the field but also trying to improve their website, which needs a ton of work. Recently I've been trying to fix the "subscribe to newsletter" button, which is broken. I used filter_var to filter the email input, but when I tried to test this out I got an error. Then I learned that the web host is still using PHP version 4.3.2 and register_globals is turned on. I've mentioned before that they should upgrade their web host (they are paying around $50 per year for Rediff Web Hosting, complete with 100MB storage and 1 MySQL database). That would add a lot of complexity for the IT staff of 3, who would have to update everyone's email information (I assume? this is a 250-person organization), and have me find a new web host and teach them about it. The staff isn't that sophisticated about web usage - the head guy still uses IE6, and the website's laid out in tables (they use Dreamweaver WYSIWYG to lay out pages). So I've got a few options: use regular expressions to filter the email, which I'm not that skilled at doing (and would be more vulnerable to exploitation after I leave); turn off register_globals and then try to teach the staff what I'm doing; or try to get them to upgrade their versions of PHP and MySQL and/or change web host. I'd appreciate some advice. Thanks for your help, Kevin

    Read the article

  • How do you efficiently implement a document similarity search system?

    - by Björn Lindqvist
    How do you implement a "similar items" system for items described by a set of tags? In my database, I have three tables: Article, ArticleTag and Tag. Each Article is related to a number of Tags via a many-to-many relationship. For each Article I want to find the five most similar articles to implement a "if you like this article you will like these too" system. I am familiar with cosine similarity and using that algorithm works very well. But it is way too slow. For each article, I need to iterate over all articles, calculate the cosine similarity for the article pair and then select the five articles with the highest similarity rating. With 200k articles and 30k tags, it takes me half a minute to calculate the similar articles for a single article. So I need another algorithm that produces roughly as good results as cosine similarity but that can be run in realtime and which does not require me to iterate over the whole document corpus each time. Maybe someone can suggest an off-the-shelf solution for this? Most of the search engines I looked at do not enable document similarity searching.
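
    A sketch of one cheap candidate-generation step (the schema is assumed here: an ArticleTag join table with article_id and tag_id columns): because two articles can only be similar when they share at least one tag, a single self-join can rank candidates by shared-tag count, and the exact cosine score then only needs to be computed for that short list rather than the whole corpus. The LIMIT syntax varies by engine (TOP on SQL Server).

        -- Candidate articles sharing at least one tag with :article_id,
        -- ranked by how many tags they share (a cheap proxy for cosine similarity).
        SELECT other.article_id,
               COUNT(*) AS shared_tags
        FROM ArticleTag mine
        JOIN ArticleTag other
          ON other.tag_id = mine.tag_id
         AND other.article_id <> mine.article_id
        WHERE mine.article_id = :article_id
        GROUP BY other.article_id
        ORDER BY shared_tags DESC
        LIMIT 50;   -- rescore only these candidates with full cosine similarity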

    Read the article

  • Reading,Writing, Editing BLOBS through DataTables and DataRows

    - by Soham
    Consider this piece of code:

        DataSet ds = new DataSet();
        SQLiteDataAdapter Da = new SQLiteDataAdapter(Command);
        Da.Fill(ds);
        DataTable dt = ds.Tables[0];
        bool PositionExists;
        if (dt.Rows.Count > 0) { PositionExists = true; } else { PositionExists = false; }
        if (PositionExists)
        {
            //dt.Rows[0].Field<>("Date")
        }

    Here the "Date" field is a BLOB. My questions are:

    a. Will reading through the DataAdapter create any problems later on, when I am working with BLOBs? More specifically, will it read the BLOB properly?

    b. That was the read part. Now, when I am writing the BLOB to the DB - it is a queue actually, i.e. I am trying to store a queue in MySQLite using a BLOB - will Conn.ExecuteNonQuery() serve my purpose?

    c. When I am reading the BLOB back from the DB, can I edit it as the original datatype it used to be in the C# environment, i.e. { Queue - BLOB - ? } { C# - MySQL - C# }? So in this context, the Date field was a queue; I wrote it back as a BLOB; when reading it back, can I access (and edit) it as a queue?

    Thank You. Soham

    Read the article

  • SQL Server CLR stored procedures in data processing tasks - good or evil?

    - by Gart
    In short - is it a good design solution to implement most of the business logic in CLR stored procedures? I have read much about them recently but I can't figure out when they should be used, what the best practices are, and whether they are good enough or not. For example, my business application needs to parse a large fixed-length text file, extract some numbers from each line in the file, apply some complex business rules according to these numbers (involving regex matching, pattern matching against data from many tables in the database and such), and as a result of this calculation update records in the database. There is also a GUI for the user to select the file, view the results, etc. This application seems to be a good candidate for the classic 3-tier architecture: the Data Layer, the Logic Layer, and the GUI Layer.

    - The Data Layer would access the database.
    - The Logic Layer would run as a WCF service and implement the business rules, interacting with the Data Layer.
    - The GUI Layer would be a means of communication between the Logic Layer and the user.

    Now, thinking of this design, I can see that most of the business rules may be implemented in SQL CLR and stored in SQL Server. I might store all my raw data in the database, run the processing there, and get the results. I see some advantages and disadvantages of this solution:

    Pros:
    - The business logic runs close to the data, meaning less network traffic.
    - Process all data at once, possibly utilizing parallelism and an optimal execution plan.

    Cons:
    - Scattering of the business logic: some part is here, some part is there.
    - Questionable design solution, may encounter unknown problems.
    - Difficult to implement a progress indicator for the processing task.

    I would like to hear all your opinions about SQL CLR. Does anybody use it in production? Are there any problems with such a design? Is it a good thing?

    Read the article

  • Can't change pivot table's Access data source - bug in Excel 2000 SP3?

    - by Ron West
    I have a set of Excel 2000 SP3 worksheets that have Pivot Tables that get data from an Access 2000 SP3 database created by a contractor who has left our company. Unfortunately, he did all his work on his private area on the company (Novell) network and now that he has left us, the drive spec has been deleted and is invalid. We were able to get the database files restored to our network area by our IT Service Desk people, but we now have to re-link everything to point to our group area instead of the now-nonexistent private area. If I follow the advice given elsewhere on this site (open the wizard, click 'Back' to get to 'Step 2 of 3', click 'Get Data...'), I get a message that the old filespec is an invalid path and that I need to check that the path name is valid and that I am connected to the server on which the file resides. I then click OK and get a Login dialog with a 'Database...' button on the right. I click this and get a 'Select Database' dialog which allows me to choose the appropriate database in its correct new location. I then click OK, which takes me back to the 'Login' screen. I can confirm that it has accepted my new location by clicking 'Database...' as before - the NEW location is still shown. So far so good - but if I then click OK I get two unhelpful messages. First I get one saying that Excel 'Could not use '|'; file already in use.' - although no other files are in use. Clicking OK takes me back to the 'Login' dialog. Clicking OK again gives me the same message as before, telling me that the OLD filespec is invalid (as if I hadn't changed anything) - but clicking the 'Database...' button shows that the correct (NEW) database location is still selected. Can anyone tell me a way of using VBA to change the link information without having to spend hours fighting the PivotTable Wizard - preferably similar to the way you update an Access TableDef:

        db.TableDefs(strLinkName).Connect = strNewLink
        db.TableDefs(strLinkName).RefreshLink

    Thanks!

    Read the article

  • Templates vs. coded HTML

    - by Alan Harris-Reid
    I have a web app consisting of some HTML forms for maintaining some tables (SQLite, with CherryPy for the web-server stuff). First I did it entirely 'the Python way', and generated HTML strings via code, with common headers, footers, etc. defined as functions in a separate module. I also like the idea of templates, so I tried Jinja2, which I find quite developer-friendly. In the beginning I thought templates were the way to go, but that was when pages were simple. Once .css and .js files were introduced (not necessarily in the same folder as the .html files), and an ever-increasing number of {{...}} variables and {%...%} commands were introduced, things started getting messy at design-time, even though they looked great at run-time. Things got even more difficult when I needed additional JavaScript in the head or body sections. As far as I can see, the main advantages of using templates are:

    - Non-dynamic elements of the page can easily be viewed in a browser during design.
    - Except for {} placeholders, HTML is kept separate from Python code.
    - If your company has a web-page designer, they can still design without knowing Python.

    while some disadvantages are:

    - {{}} delimiters are visible when viewed at design-time in a browser.
    - Associated .css and .js files have to be in the same folder to see their effects in the browser at design-time.
    - Data, variables, lists, etc., must be prepared in advance and either declared globally or passed as parameters to the render() function.

    So - when should one use 'hard-coded' HTML, and when templates? I am not sure of the best way to go, so I would be interested to hear other developers' views. TIA, Alan

    Read the article

  • Linq to SQL with INSTEAD OF Trigger and an Identity Column

    - by Bob Horn
    I need to use the clock on my SQL Server to write a time to one of my tables, so I thought I'd just use GETDATE(). The problem is that I'm getting an error because of my INSTEAD OF trigger. Is there a way to set one column to GETDATE() when another column is an identity column? This is the Linq-to-SQL:

        internal void LogProcessPoint(WorkflowCreated workflowCreated, int processCode)
        {
            ProcessLoggingRecord processLoggingRecord = new ProcessLoggingRecord()
            {
                ProcessCode = processCode,
                SubId = workflowCreated.SubId,
                EventTime = DateTime.Now // I don't care what this is. SQL Server will use GETDATE() instead.
            };

            this.Database.Add<ProcessLoggingRecord>(processLoggingRecord);
        }

    This is the table. EventTime is what I want to have as GETDATE(). I don't want the column to be null. And here is the trigger:

        ALTER TRIGGER [Master].[ProcessLoggingEventTimeTrigger]
        ON [Master].[ProcessLogging]
        INSTEAD OF INSERT
        AS
        BEGIN
            SET NOCOUNT ON;
            SET IDENTITY_INSERT [Master].[ProcessLogging] ON;

            INSERT INTO ProcessLogging (ProcessLoggingId, ProcessCode, SubId, EventTime, LastModifiedUser)
            SELECT ProcessLoggingId, ProcessCode, SubId, GETDATE(), LastModifiedUser
            FROM inserted

            SET IDENTITY_INSERT [Master].[ProcessLogging] OFF;
        END

    Without getting into all of the variations I've tried, this last attempt produces this error:

        InvalidOperationException
        Member AutoSync failure. For members to be AutoSynced after insert, the type must either have an
        auto-generated identity, or a key that is not modified by the database after insert.

    I could remove EventTime from my entity, but I don't want to do that. If it was gone, though, then it would be NULL during the INSERT and GETDATE() would be used. Is there a way that I can simply use GETDATE() on the EventTime column for INSERTs? Note: I do not want to use C#'s DateTime.Now for two reasons: 1. One of these inserts is generated by SQL Server itself (from another stored procedure). 2. Times can be different on different machines, and I'd like to know exactly how fast my processes are happening.
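
    For reference, a minimal sketch of the DEFAULT-constraint alternative the question hints at: drop the INSTEAD OF trigger and let the server stamp the time. This only fires when the INSERT omits EventTime, so the Linq-to-SQL column would have to be marked as database-generated (e.g. Auto Generated Value / AutoSync OnInsert) rather than sent from the client. The constraint name is illustrative.

        -- Server-side timestamp without a trigger: GETDATE() is applied whenever
        -- an INSERT does not supply a value for EventTime.
        ALTER TABLE [Master].[ProcessLogging]
            ADD CONSTRAINT DF_ProcessLogging_EventTime
            DEFAULT (GETDATE()) FOR EventTime;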

    Read the article

  • How do I get syncdb db_table and app_label to play nicely together

    - by Chris Heisel
    I've got a model that looks something like this:

        class HeiselFoo(models.Model):
            title = models.CharField(max_length=250)

            class Meta:
                """ Meta """
                app_label = "Foos"
                db_table = u"medley_heiselfoo_heiselfoo"

    And whenever I run my test suite, I get an error because Django isn't creating the tables for that model. It appears to be an interaction between app_label and db_table - as the test suite runs normally if db_table is set but app_label isn't. Here's a link to the full source code: http://github.com/cmheisel/heiselfoo Here's the traceback from the test suite:

        E
        ======================================================================
        ERROR: test_truth (heiselfoo.tests.HeiselFooTests)
        ----------------------------------------------------------------------
        Traceback (most recent call last):
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/heiselfoo/tests.py", line 10, in test_truth
            f.save()
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/models/base.py", line 434, in save
            self.save_base(using=using, force_insert=force_insert, force_update=force_update)
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/models/base.py", line 527, in save_base
            result = manager._insert(values, return_id=update_pk, using=using)
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/models/manager.py", line 195, in _insert
            return insert_query(self.model, values, **kwargs)
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/models/query.py", line 1479, in insert_query
            return query.get_compiler(using=using).execute_sql(return_id)
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/models/sql/compiler.py", line 783, in execute_sql
            cursor = super(SQLInsertCompiler, self).execute_sql(None)
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/models/sql/compiler.py", line 727, in execute_sql
            cursor.execute(sql, params)
          File "/Users/chris/Code/heiselfoo/ve/lib/python2.6/site-packages/django/db/backends/sqlite3/base.py", line 200, in execute
            return Database.Cursor.execute(self, query, params)
        DatabaseError: no such table: medley_heiselfoo_heiselfoo

        ----------------------------------------------------------------------
        Ran 1 test in 0.004s

        FAILED (errors=1)
        Creating test database 'default'...
        No fixtures found.
        medley_heiselfoo_heiselfoo
        Destroying test database 'default'...

    Read the article

  • Large Reports for MSRS

    - by Greg Lorenz
    I have a report that needs to be able to render a very large number of pages (about 4500 in this instance) in a web browser. The total time needed to finish on the report server, from start time to end time, is about 30 mins for the instance that I am looking at. Does anyone know what options exist for handling the rendering of such a large report in a web browser? In terms of looking into how this can be resolved, I have already performed the following tasks. The report gets its data off a database table that already has the data flattened, to the point that the TimeDataRetrieval on the report server is 17812 ms, or about 18 secs. The report itself has been reformatted to include the least expensive report objects that it can in order to render the data in the correct format. It basically consists of a table with about 4 nested tables and that's it. We were trying to accomplish this on a 2005 report server but continued to run into memory issues that were not acceptable to our clients. In response to that, we moved this onto a 2008 report server to take advantage of the fact that it uses the file system instead of memory, and we were finally able to get this to work without running out of available memory, but of course it takes much longer.

    Read the article

  • How to keep track of a private messaging system using MongoDB?

    - by luckytaxi
    Take Facebook's private messaging system, where you have to keep track of sender and receiver along with the message content. If I were using MySQL I would have multiple tables, but with MongoDB I'll try to avoid all that. I'm trying to come up with a "good" schema that can scale and is easy to maintain. If I were using MySQL, I would have a separate table to reference the user and the message. See below...

        profiles table
            user_id
            first_name
            last_name

        message table
            message_id
            message_body
            time_stamp

        user_message_ref table
            user_id (FK)
            message_id (FK)
            is_sender (boolean)

    With the schema listed above, I can query for any messages that "Bob" may have, regardless of whether he's the recipient or the sender. Now, how do I turn that into a schema that works with MongoDB? I'm thinking I'll have a separate collection to hold the messages. The problem is, how can I differentiate between the sender and the recipient? If Bob logs in, what do I query against? Depending on whether Bob initiated the email, I don't want to have to query against both "sender" and "receiver" just to see if the message belongs to the user.

    Read the article

  • Sorting GridView Formed With Data Set

    - by nani
    The following code is for sorting a GridView formed from a DataSet. Source: http://www.highoncoding.com/Articles/176_Sorting_GridView_Manually_.aspx But it is not displaying any output. There is no problem with the SQL connection. I am unable to trace the error; please help me. Thank you.

        public partial class _Default : System.Web.UI.Page
        {
            private const string ASCENDING = " ASC";
            private const string DESCENDING = " DESC";

            private DataSet GetData()
            {
                SqlConnection cnn = new SqlConnection("Server=localhost;Database=Northwind;Trusted_Connection=True;");
                SqlDataAdapter da = new SqlDataAdapter("SELECT TOP 5 firstname,lastname,hiredate FROM EMPLOYEES", cnn);
                DataSet ds = new DataSet();
                da.Fill(ds);
                return ds;
            }

            public SortDirection GridViewSortDirection
            {
                get
                {
                    if (ViewState["sortDirection"] == null)
                        ViewState["sortDirection"] = SortDirection.Ascending;
                    return (SortDirection)ViewState["sortDirection"];
                }
                set { ViewState["sortDirection"] = value; }
            }

            protected void GridView1_Sorting(object sender, GridViewSortEventArgs e)
            {
                string sortExpression = e.SortExpression;
                if (GridViewSortDirection == SortDirection.Ascending)
                {
                    GridViewSortDirection = SortDirection.Descending;
                    SortGridView(sortExpression, DESCENDING);
                }
                else
                {
                    GridViewSortDirection = SortDirection.Ascending;
                    SortGridView(sortExpression, ASCENDING);
                }
            }

            private void SortGridView(string sortExpression, string direction)
            {
                // You can cache the DataTable for improving performance
                DataTable dt = GetData().Tables[0];
                DataView dv = new DataView(dt);
                dv.Sort = sortExpression + direction;
                GridView1.DataSource = dv;
                GridView1.DataBind();
            }
        }

    The aspx page:

        <asp:GridView ID="GridView1" runat="server" AllowSorting="True" OnSorting="GridView1_Sorting">
        </asp:GridView>

    Read the article

  • VSDBCMD deployment for additions to third party databases

    - by Sam
    We have some custom objects (stored procedures etc.) in a SQL Server 2005 database belonging to an ERP system. The custom objects are in different schemas to the ERP objects. We're using Database Edition .dbproj projects and vsdbcmd deployment for all our custom application databases and would like to similarly manage our custom objects in the ERP database. It's not clear how this can be done without either:

    1. Importing all ERP objects (~4000 tables) into the .dbproj and manually keeping them in sync with ERP development. Visual Studio fell over the only time I tried importing these, so I've no idea whether it can actually handle a project of this size.
    2. Somehow excluding the ERP schemas (there are two) from the diff process to ensure they don't get dropped by vsdbcmd. I haven't found any documentation which suggests this is possible. I'm aware of the IgnoreDefaultSchema setting, but there are two schemas I need to ignore and I'm not comfortable with the 'default schema' approach - deployment by different users could be disastrous.

    Has anyone managed to successfully use .dbproj & vsdbcmd for custom additions to a third-party database? If not, how do you manage SQL source control & deployment?

    Read the article

  • iPhone: TableView inside UIScrollView shows vacillating scrollbars around

    - by karim
    Hi, I have added some tables and other views to a scroll view. Scrolling in the tables is working fine. But in the parent scroll view, when scrolling, vacillating vertical scrollbars are shown; sometimes they even come to the middle of the screen, sometimes they show at the left side of the screen, and they are not limited to the vertical scrollbar region. When I set showsVerticalScrollIndicator = NO, they are not shown. But do you know why the scrollbar is moving around? DashboardView is a subclass of UIScrollView.

        dashboard = [[DashboardView alloc] initWithFrame:fullScreenRect];
        dashboard.contentSize = CGSizeMake(320, 700); // must do!
        dashboard.showsVerticalScrollIndicator = YES;
        dashboard.bounces = YES;
        self.view = dashboard;

        @implementation DashboardView

        - (id)initWithFrame:(CGRect)frame {
            if (self = [super initWithFrame:frame]) {
                // Initialization code
            }
            return self;
        }

        - (void)drawRect:(CGRect)rect {
            // Drawing code
        }

        - (void)layoutSubviews {
            NSArray *views = self.subviews;

            [UIView beginAnimations:@"CollapseExpand" context:nil];
            [UIView setAnimationDuration:0.5];
            [UIView setAnimationBeginsFromCurrentState:YES];
            [UIView setAnimationCurve:UIViewAnimationCurveEaseIn];

            UIView *view = [views objectAtIndex:0];
            CGRect rect = view.frame;
            for (int i = 1; i < [views count]; i++) {
                view = [views objectAtIndex:i];
                view.frame = CGRectMake(rect.origin.x, rect.origin.y + rect.size.height,
                                        view.frame.size.width, view.frame.size.height);
                rect = view.frame;
            }

            [UIView commitAnimations];
        }

    Read the article

  • Parsing SOAP XML in Oracle

    - by user258587
    Hi, I am new to Oracle and I am working on something that needs to parse a SOAP request and save the address to DB tables. I am using the XML parser in Oracle (XMLType) with XPath, but am struggling since I can't figure out the way to parse the SOAP request because it has multiple namespaces. Could anyone give me an example? Thanks in advance!

    Edit: It would be a typical SOAP request similar to the one below.

        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                          xmlns:soap="http://soap.service.****.com">
          <soapenv:Header />
          <soapenv:Body>
            <soap:UpdateElem>
              <soap:request>
                <soap:att1>123456789</soap:att1>
                <soap:att2 xsi:nil="true" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" />
                <soap:att3>L</soap:att3>
                .....
              </soap:request>
            </soap:UpdateElem>
          </soapenv:Body>
        </soapenv:Envelope>

    I need to retrieve parameters att1, att2, ... and save them into a DB table.
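
    A minimal sketch of one way to handle the multiple namespaces is Oracle's XMLTABLE with an XMLNAMESPACES clause, which declares both prefixes before the XPath is evaluated. The bind variable :soap_request and the column sizes are assumptions, and the namespace URIs must match the actual request exactly:

        -- Map both namespaces to prefixes, then pull the parameters out of soap:request.
        SELECT x.att1, x.att2, x.att3
          FROM XMLTABLE(
                 XMLNAMESPACES('http://schemas.xmlsoap.org/soap/envelope/' AS "soapenv",
                               'http://soap.service.****.com'              AS "soap"),
                 '/soapenv:Envelope/soapenv:Body/soap:UpdateElem/soap:request'
                 PASSING XMLTYPE(:soap_request)
                 COLUMNS att1 VARCHAR2(30) PATH 'soap:att1',
                         att2 VARCHAR2(30) PATH 'soap:att2',
                         att3 VARCHAR2(30) PATH 'soap:att3'
               ) x;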

    Read the article

  • Is there a way to effect user defined data types in MySQL?

    - by Dancrumb
    I have a database which stores (among other things) the following pieces of information:

        Hardware IDs             BIGINTs
        Storage Capacities       BIGINTs
        Hardware Names           VARCHARs
        World Wide Port Names    VARCHARs

    I'd like to be able to capture a more refined definition of these datatypes. For instance, the hardware IDs have no numerical significance, so I don't care how they are formatted when displayed. The Storage Capacities, however, are cardinal numbers and, at a user's request, I'd like to present them with thousands and decimal separators, e.g. 123,456.789. Thus, I'd like to refine BIGINT into, say, ID_NUMBER and CARDINAL. The same with Hardware Names, which are simple text, and WWPNs, which are hex strings, e.g. 24:68:AC:E0. Thus, I'd like to refine VARCHAR into ENGLISH_WORD and HEXSTRING. The specific datatypes I made up are just for illustrative purposes. I'd like to keep all this information in one place and I'm wondering if anybody knows of a good way to hold this all in my MySQL table definitions. I could use the Comment field of the table definition, but that smells fishy to me. One approach would be to define the data structure elsewhere and use that definition to generate my CREATE TABLEs, but that would be a major rework of the code that I currently have, so I'm looking for alternatives. Any suggestions? The application language in use is Perl, if that helps.
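
    A sketch of one option that keeps the logical types inside the MySQL definitions themselves while remaining machine-readable (the question calls the table-level Comment field fishy; this is the column-level variant of the same idea, with illustrative table and column names): tag each column with a COMMENT and read the tags back from information_schema, e.g. from Perl, when deciding how to format values.

        -- Logical type recorded per column; table/column names are examples only.
        CREATE TABLE hardware (
            hardware_id   BIGINT      NOT NULL COMMENT 'ID_NUMBER',
            capacity      BIGINT      NOT NULL COMMENT 'CARDINAL',
            hardware_name VARCHAR(64) NOT NULL COMMENT 'ENGLISH_WORD',
            wwpn          VARCHAR(23) NOT NULL COMMENT 'HEXSTRING',
            PRIMARY KEY (hardware_id)
        );

        -- The tags can be queried back at runtime:
        SELECT COLUMN_NAME, COLUMN_COMMENT
        FROM information_schema.COLUMNS
        WHERE TABLE_SCHEMA = DATABASE()
          AND TABLE_NAME = 'hardware';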

    Read the article

  • Linq to SQL Problem System.Data.Linq.IdentityManager.StandardIdentityManager.MultiKeyManager

    - by luckyluke
    I have a really tricky thing going on here. My project has around 100 tables and they are all mapped by LINQ. Everything works fine in the dev and test environments. These environments are MS Win 2008 R2 servers with SQL 2008 SP1 databases; IIS and SQL are on different machines. Now, on the production environment, which is an MS Win 2003 x64 web farm + geo-clustered SQL 2008, IT DOES not work. All I get is the exception:

        System.IndexOutOfRangeException: Index was outside the bounds of the array.
        at System.Data.Linq.IdentityManager.StandardIdentityManager.MultiKeyManager`3.TryCreateKeyFromValues(Object[] values, MultiKey& k)
        at System.Data.Linq.IdentityManager.StandardIdentityManager.IdentityCache`2.Find(Object[] keyValues)
        at System.Data.Linq.ChangeProcessor.GetOtherItem(MetaAssociation assoc, Object instance)
        at System.Data.Linq.ChangeProcessor.BuildEdgeMaps()
        at System.Data.Linq.ChangeProcessor.SubmitChanges(ConflictMode failureMode)
        at System.Data.Linq.DataContext.SubmitChanges(ConflictMode failureMode)
        at ERS.IIMP.Services.ExposuresSrv.Update(Int32 ExpID, Int32 AssID)
        Services\ExposuresSrv.cs

    My question is: what the hell? They have precisely the same DBML, the DB has exactly THE SAME structure (when I take the DB from prod to TEST and mount it, everything works just great), and the binaries on the web server are the same. I seriously do not know what to do. Has anyone found that LINQ works in one environment and does not in another? I am really lost here. I really hope you can help me :)

    Read the article
