Search Results

Search found 18272 results on 731 pages for 'ldap query'.

Page 433/731 | < Previous Page | 429 430 431 432 433 434 435 436 437 438 439 440  | Next Page >

  • confusion about using types instead of gtts in oracle

    - by Omnipresent
    I am trying to convert queries like the one below to use types so that I won't have to use a GTT: insert into my_gtt_table_1 (house, lname, fname, MI, fullname, dob) (select house, lname, fname, MI, fullname, dob from (select 'REG' house, mbr_last_name lname, mbr_first_name fname, mbr_mi MI, mbr_first_name || mbr_mi || mbr_last_name fullname, mbr_dob dob from table_1 a, table_b where a.head = b.head and mbr_number = '01' and mbr_last_name = v_last_name) c. This is just a sample; the real queries are more complex, and they run inside a stored procedure. To avoid the GTT (my_gtt_table_1) I did the following: create or replace type lname_row as object ( house varchar2(30) lname varchar2(30), fname varchar2(30), MI char(1), fullname VARCHAR2(63), dob DATE ) create or replace type lname_exact as table of lname_row Then in the stored procedure: type lname_exact is table of <what_table_should_i_put_here>%rowtype; tab_a_recs lname_exact; I am not sure what table to put there, as my query has nested subqueries. The query in the SP (just a test to see whether it works): select lname_row('', '', '', '', '', '', sysdate) bulk collect into tab_a_recs from table_1; I am getting errors like ORA-00913: too many values. I am really confused and stuck with this :(
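
    ORA-00913 here usually means the object type's constructor is being called with more values than the type has attributes: the type as posted declares six fields (and is missing a comma after house varchar2(30)), while lname_row('', '', '', '', '', '', sysdate) passes seven values. A minimal sketch, assuming only the names shown in the question, might look like this; once the collection type exists at SQL level, the %rowtype declaration in the procedure is not needed:

        create or replace type lname_row as object (
          house    varchar2(30),
          lname    varchar2(30),
          fname    varchar2(30),
          MI       char(1),
          fullname varchar2(63),
          dob      date
        );
        /
        create or replace type lname_exact as table of lname_row;
        /
        declare
          tab_a_recs lname_exact;
        begin
          -- pass exactly six values, one per attribute of lname_row
          select lname_row(house, lname, fname, MI, fullname, dob)
            bulk collect into tab_a_recs
            from (select 'REG' house, mbr_last_name lname, mbr_first_name fname,
                         mbr_mi MI, mbr_first_name || mbr_mi || mbr_last_name fullname,
                         mbr_dob dob
                    from table_1);
        end;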

    Read the article

  • NHibernate correct way to reattach cached entity to different session

    - by Chris Marisic
    I'm using NHibernate to query a list of objects from my database. After I get the list I iterate over it and apply a distance approximation algorithm to find the nearest object. Getting the list and running the algorithm over it is a heavy operation, so I cache the object the algorithm finds in HttpRuntime.Cache. From that point on, whenever I'm given the same input again I can pull the object straight from the cache instead of hitting the database and traversing the list. My object is a complex object with collections attached to it; in the query that returns the full list I don't bring back any of the sub-collections eagerly, so when I read my cached object I need lazy loading to work correctly in order to display the object fully. Originally I tried using this to re-associate my cached object with a new session: _session.Lock(obj, LockMode.None); However, when accessing the page concurrently from another instance I get the error Illegal attempt to associate a collection with two open sessions. I then tried something different with _session.Merge(obj); However, watching the output of this in NHProf shows that it is deleting and re-associating my object's contained collections, which is not what I want, although it seems to work fine. What is the correct way to do this? Neither of these seems to be right.
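
    One pattern that sidesteps sharing a session-bound instance between requests is to cache only the entity's identifier and re-load the object in whatever session is current (optionally backed by NHibernate's second-level cache). A sketch, in which MyEntity, cacheKey, input and FindNearest are hypothetical names standing in for the question's types:

        // Sketch only: cache the id, never the session-bound instance itself.
        var cachedId = HttpRuntime.Cache[cacheKey] as int?;
        if (cachedId == null)
        {
            var candidates = _session.Query<MyEntity>().ToList();  // the existing heavy query (NHibernate.Linq)
            var nearest = FindNearest(candidates, input);          // distance approximation step
            HttpRuntime.Cache.Insert(cacheKey, nearest.Id);
            return nearest;
        }

        // Get() attaches the entity to the *current* session, so lazy loading of its
        // collections works and no instance is ever shared between two open sessions.
        return _session.Get<MyEntity>(cachedId.Value);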

    Read the article

  • How to calculate commission based on referred members?

    - by RAJKISHOR SAHU
    Hello everybody, I am developing a small piece of software in C# WPF for a consultancy that runs a chain-system business. I have coded a tree structure to show who referred whom. Commission depends on level: if 1 referred 2 & 3, then 1 gets the level-1 commission; if 2 referred 4, 5 and 3 referred 6, 7, then 1 receives the level-2 commission. The chain goes on up to a certain total number. My problem is how to implement this logic; I can already see who has referred how many members via the UDF written for adding TreeViewItems to the TreeView. Alternatively, how can I count the items in a TreeView at a certain level? Node-adding UDF: public void AddNodes(int uid, TreeViewItem tSubNode) { string query = "select fullname, id from members where refCode=" + uid + ";"; MySqlCommand cmd = new MySqlCommand(query, db.conn); MySqlDataAdapter _DA = new MySqlDataAdapter(cmd); DataTable _DT = new DataTable(); tSubNode.IsExpanded = true; _DA.Fill(_DT); foreach (DataRow _dr in _DT.Rows) { TreeViewItem tNode = new TreeViewItem(); tNode.Header = _dr["fullname"].ToString()+" ("+_dr["id"].ToString()+")"; tSubNode.Items.Add(tNode); if (db.HasMembers(Convert.ToInt32(_dr["id"].ToString()))) { AddNodes(Convert.ToInt32(_dr["id"]), tNode); } } //This line tracks who has referred how many members Console.WriteLine("Tree node Count : "+tSubNode.Items.Count.ToString()+", UID "+uid); } Help me please!
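
    Counting nodes at a given depth can be done with a small recursive helper over the Items collection; a sketch (level 1 meaning the direct children of whatever control is passed in, so the TreeView itself or any TreeViewItem can be the starting point):

        // Counts TreeViewItems at a given depth below 'parent' (1 = direct children).
        public int CountAtLevel(System.Windows.Controls.ItemsControl parent, int level)
        {
            if (level == 1)
                return parent.Items.Count;

            int count = 0;
            foreach (System.Windows.Controls.TreeViewItem child in parent.Items)
                count += CountAtLevel(child, level - 1);
            return count;
        }

        // Usage sketch: number of second-level referrals under the whole tree
        // int secondLevel = CountAtLevel(myTreeView, 2);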

    Read the article

  • How to eliminate duplicate rows?

    - by Odette
    Hi guys, I'm back with my original query and I just have one question please (PS: I know I have to vote and register, and I promise I will do that today). With the following query (T-SQL) I am getting the correct results, except that there are duplicates now. I have been reading up and think I can use the PARTITION BY syntax - can you please show me how to incorporate it? WITH CALC1 AS (SELECT OTQUOT, OTIT01 AS ITEMS, ROUND(OQCQ01 * OVRC01,2) AS COST FROM @[email protected] WHERE OTIT01 < '' UNION ALL ... SELECT OTQUOT, OTIT10 AS ITEMS, ROUND(OQCQ10 * OVRC10,2) AS COST FROM @[email protected] WHERE OTIT10 < '' ) SELECT OTQUOT, DESC, ITEMS, RN FROM ( SELECT OTQUOT, ITEMS, B.IXRPGP AS GROUP, C.OTRDSC AS DESC, COST, ROW_NUMBER() OVER (PARTITION BY OTQUOT ORDER BY COST DESC) AS RN FROM CALC1 AS A INNER JOIN @[email protected] AS B ON (A.ITEMS = B.IKITMC) INNER JOIN DATAGRP.GDSGRP AS C ON (B.IXRPGP = C.OKRPGP) ) T RESULTS: 60408169 FENCING GNCPDCTP18BGBG 1 60408169 FENCING CGIFESHPD1795BG 2 60408169 FENCING GTTCGIBG 3 60408169 FENCING GBTCGIBG 4 How do I get rid of the duplicates? Thanks Bill and all the others for your help (I am still learning!)
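
    Since ROW_NUMBER() is already partitioned by OTQUOT, one hedged reading of "remove the duplicates" is to keep only the highest-cost row per quote by filtering on RN in the outer query (add the group to the PARTITION BY instead if you want one row per quote and group). A sketch, with ITEMGRP standing in for the obfuscated table name and DESC renamed GRPDESC to dodge the reserved word; the CALC1 CTE stays exactly as in the question:

        SELECT OTQUOT, GRPDESC, ITEMS, RN
        FROM (
            SELECT A.OTQUOT, A.ITEMS, C.OTRDSC AS GRPDESC, A.COST,
                   ROW_NUMBER() OVER (PARTITION BY A.OTQUOT ORDER BY A.COST DESC) AS RN
            FROM CALC1 AS A
            INNER JOIN ITEMGRP AS B ON A.ITEMS = B.IKITMC          -- placeholder table name
            INNER JOIN DATAGRP.GDSGRP AS C ON B.IXRPGP = C.OKRPGP
        ) T
        WHERE RN = 1          -- keep only the top-cost row per OTQUOT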

    Read the article

  • Doctrine2 - relationship

    - by Filip Golonka
    I'm developing an application that looks for optimal routes and timetables in public transport. I have some experience with Doctrine1, but it's my first time with Doctrine2. There are some new fields to describe relations (mappedBy and inversedBy) and also some new ways of mapping. I have the following code: $query = $this->em->createQuery("SELECT partial cls.{stop}, partial t.{arriveTime, departureTime} FROM \Entities\Timetable t JOIN t.ride r JOIN t.carrierLineStop cls WHERE t.departureTime>=:time AND r.idCarrierLine=:carrierLine AND (cls.idStop=:firstStop OR cls.idStop=:lastStop)"); $query->setParameters(array( 'time' => $time, 'carrierLine' => $path->getLine(), 'firstStop' => $path->getFirstStop(), 'lastStop' => $path->getLastStop() )); When I try to execute that script I get an error: [Semantical Error] line 0, col 24 near '}, partial t.{arriveTime,': Error: There is no mapped field named 'stop' on class Entities\CarrierLineStop. Mapping files: Entities\CarrierLineStop: type: entity table: carrier_line_stop fields: idCarrierLineStop: id: true type: integer unsigned: false nullable: false column: id_carrier_line_stop generator: strategy: IDENTITY nextStop: type: integer unsigned: false nullable: true column: next_stop manyToOne: idCarrierLine: targetEntity: Entities\CarrierLine cascade: { } mappedBy: null inversedBy: null joinColumns: id_carrier_line: referencedColumnName: id_carrier_line orphanRemoval: false stop: column: id_stop targetEntity: Entities\Stop cascade: { } mappedBy: null inversedBy: carrierLineStop joinColumns: id_stop: referencedColumnName: id_stop orphanRemoval: false lifecycleCallbacks: { } - Entities\Stop: type: entity table: stop fields: idStop: id: true type: integer unsigned: false nullable: false column: id_stop generator: strategy: IDENTITY name: type: string length: 45 fixed: false nullable: true miejscowosc: type: string length: 45 fixed: false nullable: true latitude: type: decimal nullable: true longitude: type: decimal nullable: true oneToMany: carrierLineStop: targetEntity: Entities\CarrierLineStop cascade: { } mappedBy: stop inversedBy: null joinColumns: id_stop: referencedColumnName: id_stop orphanRemoval: false lifecycleCallbacks: { } I have no idea where the problem is...

    Read the article

  • php paging and the use of limit clause

    - by Average Joe
    Imagine you have a 1M-record table and you want to limit the search results to, say, 10,000 and no more than that. What do I use for that? The answer is the LIMIT clause, for example: select recid from mytable order by recid asc limit 10000. This gives me 10,000 records from the table. So far no paging, but the LIMIT clause is already in use, and that brings the question to the next level. What if I want to page through this particular record set 100 records at a time? Since the LIMIT clause is already part of the original query, how do I use it again, this time to take care of the paging? If the original query did not have a LIMIT clause to begin with, I'd set it to limit 0,100, then adjust it to 100,100, then 200,100 and so on as the paging takes its course. But as it stands, I cannot. You'd almost think you'd want to use two LIMIT clauses one after the other - which is not going to work: limit 10000 limit 0,100 would certainly error out. So what's the solution in this particular case?
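
    One way to keep both constraints is to page inside a derived table that is already capped at 10,000 rows, so the inner LIMIT handles the overall cap and the outer LIMIT handles paging. A sketch using the table and column names from the question; the page parameter and zero-based page numbering are assumptions:

        <?php
        $page    = isset($_GET['page']) ? (int)$_GET['page'] : 0;  // hypothetical paging parameter
        $perPage = 100;
        $offset  = $page * $perPage;

        // inner query caps the result set, outer query pages within that cap
        $sql = "SELECT recid FROM (
                    SELECT recid FROM mytable ORDER BY recid ASC LIMIT 10000
                ) AS capped
                ORDER BY recid ASC
                LIMIT $offset, $perPage";
        ?>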

    Read the article

  • EF/LINQ: Where() against a property of a subtype

    - by ladenedge
    I have a set of POCOs, all of which implement the following simple interface: interface IIdObject { int Id { get; set; } } A subset of these POCOs implement this additional interface: interface IDeletableObject : IIdObject { bool IsDeleted { get; set; } } I have a repository hierarchy that looks something like this: IRepository<T> <: BasicRepository<T> <: ValidatingRepository<T> (where T is IIdObject) I'm trying to add a FilteringRepository to the hierarchy such that all of the POCOs that implement IDeletableObject have a Where(p => p.IsDeleted == false) filter applied before any other queries take place. My goal is to avoid duplicating the hierarchy solely for IDeletableObjects. My first attempt looked like this: public override IQueryable<T> Query() { return base.Query().Where(t => ((IDeletableObject)t).IsDeleted == false); } This works well with LINQ to Objects, but when I switch to an EF backend I get: "LINQ to Entities only supports casting Entity Data Model primitive types." I went on to try some fancier parameterized solutions, but they ultimately failed because I couldn't make T covariant in the following case for some reason I don't quite understand: interface IQueryFilter<out T> // error { Expression<Func<T, bool>> GetFilter(); } I'd be happy to go into more detail on my more complicated solutions if it would help, but I think I'll stop here for now in the hope that someone might have an idea for me to try. Thanks very much in advance!
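
    The EF error comes from the cast to IDeletableObject inside the lambda; one workaround is to build the predicate as an expression tree against T itself, so no cast ever appears in the query. A minimal sketch, assuming FilteringRepository<T> derives from BasicRepository<T> and constrains T to classes implementing IDeletableObject:

        using System.Linq.Expressions;

        public class FilteringRepository<T> : BasicRepository<T> where T : class, IDeletableObject
        {
            public override IQueryable<T> Query()
            {
                // Build t => t.IsDeleted == false without casting t to the interface,
                // so the provider only sees a plain member access on T.
                var parameter  = Expression.Parameter(typeof(T), "t");
                var notDeleted = Expression.Lambda<Func<T, bool>>(
                    Expression.Equal(
                        Expression.Property(parameter, "IsDeleted"),
                        Expression.Constant(false)),
                    parameter);

                return base.Query().Where(notDeleted);
            }
        }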

    Read the article

  • Update table without using cursor and on date

    - by Muhammad Kashif Nadeem
    Please copy and run the following script DECLARE @Customers TABLE (CustomerId INT) DECLARE @Orders TABLE ( OrderId INT, CustomerId INT, OrderDate DATETIME ) DECLARE @Calls TABLE (CallId INT, CallTime DATETIME, CallToId INT, OrderId INT) ----------------------------------------------------------------- INSERT INTO @Customers SELECT 1 INSERT INTO @Customers SELECT 2 ----------------------------------------------------------------- INSERT INTO @Orders SELECT 10, 1, DATEADD(d, -20, GETDATE()) INSERT INTO @Orders SELECT 11, 1, DATEADD(d, -10, GETDATE()) ----------------------------------------------------------------- INSERT INTO @Calls SELECT 101, DATEADD(d, -19, GETDATE()), 1, NULL INSERT INTO @Calls SELECT 102, DATEADD(d, -17, GETDATE()), 1, NULL INSERT INTO @Calls SELECT 103, DATEADD(d, -9, GETDATE()), 1, NULL INSERT INTO @Calls SELECT 104, DATEADD(d, -6, GETDATE()), 1, NULL INSERT INTO @Calls SELECT 105, DATEADD(d, -5, GETDATE()), 1, NULL ----------------------------------------------------------------- I want to update the @Calls table and need the following results. I am using the query UPDATE @Calls SET OrderId = ( CASE WHEN (s.CallTime > e.OrderDate) THEN e.OrderId END ) FROM @Calls s INNER JOIN @Orders e ON s.CallToId = e.CustomerId and its result is not what I need. Requirement: as you can see there are two orders, one on 2010-12-12 and one on 2010-12-22. I want to update the @Calls table with the relevant OrderId with respect to CallTime. In short, if subsequent Orders are added and there are further calls, we assume that a new call is associated with the most recent Order. Note: this is sample data, so it is not the case that I always have two Orders; there might be 10+ Orders and 100+ calls. Note 2: I could not find a good title for this question; please change it if you can think of a better one. Thanks.
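
    A set-based way to pick the most recent order at or before each call time is OUTER APPLY with TOP 1 (calls with no preceding order are left NULL). A sketch against the table variables from the script:

        UPDATE c
        SET    OrderId = o.OrderId
        FROM   @Calls AS c
        OUTER APPLY (
            SELECT TOP 1 OrderId
            FROM   @Orders
            WHERE  CustomerId = c.CallToId
              AND  OrderDate  <= c.CallTime
            ORDER BY OrderDate DESC        -- most recent order not after the call
        ) AS o;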

    Read the article

  • php request variables assigning $_GET

    - by chris
    If you take a look at a previous question, http://stackoverflow.com/questions/2690742/mod-rewrite-title-slugs-and-htaccess, I am using the solution that Col. Shrapnel proposed - but when I assign values to $_GET in the actual file, and not from a request, the code doesn't work. It defaults away from the file as if the $_GET variables are not set. The code I have come up with is: if(!empty($_GET['cat'])){ $_GET['target'] = "category"; if(isset($_GET['page'])){ $_GET['pageID'] = $_GET['page']; } $URL_query = "SELECT category_id FROM cats WHERE slug = '".$_GET['cat']."';"; $URL_result = mysql_query($URL_query); $URL_array = mysql_fetch_array($URL_result); $_GET['category_id'] = $URL_array['category_id']; }elseif($_GET['product']){ $_GET['target'] = "product"; $URL_query = "SELECT product_id FROM products WHERE slug = '".$_GET['product']."';"; $URL_result = mysql_query($URL_query); $URL_array = mysql_fetch_array($URL_result); print_r($URL_array); $_GET['product_id'] = $URL_array['product_id']; The original variable string that I'm trying to represent is /cart.php?Target=product&product_id=16142&category_id=249, and I'm trying to build the query string variables in code and then include cart.php so I can use cleaner URLs. So I have product/product-title-with-clean-url/ going to slug.php?product=slug; the slug script then searches the db for a record with the matching slug and returns the product_id as in the code above, builds the query string, and includes cart.php.
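
    One thing worth checking: PHP array keys are case-sensitive, so if cart.php reads $_GET['Target'] (capital T, as in the original URL) while the slug script sets $_GET['target'], the keys never match and cart.php behaves as if nothing was set. A sketch that sets the exact keys from the original URL and escapes the slug; the capital-T key is an assumption based on the URL shown, not something confirmed by the question:

        <?php
        if (!empty($_GET['cat'])) {
            $slug   = mysql_real_escape_string($_GET['cat']);
            $result = mysql_query("SELECT category_id FROM cats WHERE slug = '$slug'");
            $row    = mysql_fetch_array($result);

            $_GET['Target']      = 'category';        // same casing as in the original URL
            $_GET['category_id'] = $row['category_id'];
        } elseif (!empty($_GET['product'])) {
            $slug   = mysql_real_escape_string($_GET['product']);
            $result = mysql_query("SELECT product_id FROM products WHERE slug = '$slug'");
            $row    = mysql_fetch_array($result);

            $_GET['Target']     = 'product';
            $_GET['product_id'] = $row['product_id'];
        }
        include 'cart.php';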

    Read the article

  • Allowing access to company files across the internet

    - by Renaud Bompuis
    The premise I've been tasked with finding a solution to the following scenario: our main file server is a Linux machine. on the LAN, users simply access the files using SMB. each user has an account on the file server and his/her own access rights. user accounts are simple passwd/group security accounts, not NIS/LDAP. The problem We want to give users (or at least some of them, say if they belong to a particular group) the ability to access the files from the Internet while travelling. Ideally I'd like a seamless solution. Maybe something that allows the user to access a mapped drive would be ideal. A web-oriented solution is also good but it should present files in a way that is familiar to users, in an explorer-like fashion for instance. Security is a must of course, and users would be expected to log-in. The connection to the server should also be encrypted. Anyone has some pointers to neat solutions? Any experiences? Edit The client machines are Windows only.

    Read the article

  • Using reflection to find all linq2sql tables and ensure they match the database

    - by Jake Stevenson
    I'm trying to use reflection to automatically test that all my linq2sql entities match the test database. I thought I'd do this by getting all the classes that inherit from DataContext from my assembly: var contexttypes = Assembly.GetAssembly(typeof (BaseRepository<,>)).GetTypes().Where( t => t.IsSubclassOf(typeof(DataContext))); foreach (var contexttype in contexttypes) { var context = Activator.CreateInstance(contexttype); var tableProperties = type.GetProperties().Where(t=> t.PropertyType.Name == typeof(ITable<>).Name); foreach (var propertyInfo in tableProperties) { var table = (propertyInfo.GetValue(context, null)); } } So far so good, this loops through each ITable<> in each DataContext in the project. If I debug the code, "table" is properly instantiated, and if I expand the results view in the debugger I can see actual data. BUT, I can't figure out how to get my code to actually query that table. I'd really like to just be able to do table.FirstOrDefault() to get the top row out of each table and make sure the SQL fetch doesn't fail. But I can't cast that table to be anything I can query. Any suggestions on how I can make this queryable? Just the ability to call .Count() would be enough for me to ensure the entities don't have anything that doesn't match the table columns.
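
    The non-generic ITable implements IEnumerable, so one way to force a real round-trip without knowing the entity type at compile time is to enumerate it with LINQ to Objects and stop after the first row. A sketch of the inner loop, under the assumption that a column mismatch would surface as an exception when the generated SELECT runs:

        foreach (var propertyInfo in tableProperties)
        {
            var table = (System.Collections.IEnumerable)propertyInfo.GetValue(context, null);

            // Enumerating the table executes the SELECT against the database;
            // pulling just the first row is enough to prove the query is valid.
            var firstRow = table.Cast<object>().FirstOrDefault();
        }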

    Read the article

  • LINQ to SQL: Reusable expression for property?

    - by coenvdwel
    Pardon me for being unable to phrase the title more exactly. Basically, I have three LINQ objects linked to tables. One is Product, the other is Company and the last is a mapping table Mapping to store which Company sells which Products and by which ID this Company refers to this Product. I am now retrieving a list of products as follows: var options = new DataLoadOptions(); options.LoadWith<Product>(p => p.Mappings); context.LoadOptions = options; var products = ( from p in context.Products select new { ProductID = p.ProductID, //BackendProductID = p.BackendProductID, BackendProductID = (p.Mappings.Count == 0) ? "None" : (p.Mappings.Count > 1) ? "Multiple" : p.Mappings.First().BackendProductID, Description = p.Description } ).ToList(); This does a single query retrieving the information I want. But I want to be able to move the logic behind BackendProductID into the LINQ object so I can use the commented line instead of the annoyingly nested ternary operator statements, for neatness and re-usability. So I added the following property to the Product object: public string BackendProductID { get { if (Mappings.Count == 0) return "None"; if (Mappings.Count > 1) return "Multiple"; return Mappings.First().BackendProductID; } } The list is still the same, but it now does a query for every single Product to get its BackendProductID. The code is neater and re-usable, but the performance now is terrible. What I need is some kind of Expression or Delegate, but I couldn't get my head around writing one. It always ended up querying for every single product, still. Any help would be appreciated!
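
    One common approach (this sketch assumes the LinqKit library, which the question does not mention) keeps the logic in a single static expression and invokes it inside an AsExpandable() query, so it is still folded into the one SQL statement rather than run per row:

        public static class ProductExpressions
        {
            // Single, reusable place for the BackendProductID logic, still translatable to SQL.
            public static readonly Expression<Func<Product, string>> BackendProductId =
                p => (p.Mappings.Count == 0) ? "None"
                   : (p.Mappings.Count > 1)  ? "Multiple"
                   : p.Mappings.First().BackendProductID;
        }

        // Usage sketch:
        var products = (
            from p in context.Products.AsExpandable()          // LinqKit
            select new
            {
                p.ProductID,
                BackendProductID = ProductExpressions.BackendProductId.Invoke(p),  // LinqKit Invoke
                p.Description
            }).ToList();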

    Read the article

  • CFfile -- value is not set to the queried data

    - by user494901
    I have an add-user form that also doubles as an edit-user form by querying the data and setting value="#query.xvalue#". If the user exists (e.g. you're editing a user), it loads the user's data from the database. When doing this for the <cffile> field it does not load the data, and then when the insert runs it overwrites the database values with a blank string (if the user does not supply a new file). How do I avoid this? Code: Form: <br/>Digital Copy<br/> <!--- If null, set a default if not, set the default to database default ---> <cfif len(Trim(certificationsList.cprAdultImage)) EQ 0> <cfinput type="file" required="no" name="cprAdultImage" value="" > <cfelse> File Exists: <cfoutput><a href="#certificationsList.cprAdultImage#">View File</a></cfoutput> <cfinput type="file" required="no" name="cprAdultImage" value="#certificationsList.cprAdultImage#"> </cfif> Form Processor: <!--- Has a file been specified? ---> <cfif not len(Trim(form.cprAdultImage)) EQ 0> <cffile action="upload" filefield="cprAdultImage" destination="#destination#" nameConflict="makeUnique"> <cfinvokeargument name="cprAdultImage" value="#pathOfFile##cffile.serverFile#"> <cfelse> <cfinvokeargument name="cprAdultImage" value=""> </cfif> CFC ARGS: <cfargument name="cprAdultExp" required="NO"> <cfargument name="cprAdultCompany" type="string" required="no"> <cfargument name="cprAdultImage" type="string" required="no"> <cfargument name="cprAdultOnFile" type="boolean" required="no"> Query: UPDATE mod_StudentCertifications SET cprAdultExp='#DateFormat(ARGUMENTS.cprAdultExp, "mm/dd/yyyy")#', cprAdultCompany='#Trim(ARGUMENTS.cprAdultCompany)#', cprAdultImage='#Trim(ARGUMENTS.cprAdultImage)#', cprAdultOnFile='#Trim(ARGUMENTS.cprAdultOnFile)#' INSERT INTO mod_StudentCertifications( cprAdultExp, cprAdultcompany, cprAdultImage, cprAdultOnFile
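
    One way to avoid wiping the stored path is to make the column assignment conditional inside the query, so cprAdultImage is only touched when a new file was actually uploaded. A sketch; the datasource name and the WHERE clause key are placeholders, not values from the question:

        <cfquery datasource="#application.dsn#">
            UPDATE  mod_StudentCertifications
            SET     cprAdultExp     = '#DateFormat(ARGUMENTS.cprAdultExp, "mm/dd/yyyy")#',
                    cprAdultCompany = '#Trim(ARGUMENTS.cprAdultCompany)#',
                    cprAdultOnFile  = '#Trim(ARGUMENTS.cprAdultOnFile)#'
                    <cfif Len(Trim(ARGUMENTS.cprAdultImage))>
                        , cprAdultImage = '#Trim(ARGUMENTS.cprAdultImage)#'
                    </cfif>
            WHERE   studentID = '#ARGUMENTS.studentID#'  <!--- placeholder key column --->
        </cfquery>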

    Read the article

  • How to keep form items selected after post request?

    - by Ole Jak
    I have a simple HTML form on a PHP page, with a simple list placed on the form. I submit this form (the selected list items) to the same page, so it causes a page refresh. I want the items that were POSTed to remain selected after the form is submitted. How do I do that? For my form I use this code: <form action="FormPage.php" method="post"> <select id="Streams" class="multiselect ui-widget-content ui-corner-all" multiple="multiple" name="Streams[]"> <?php $query = " SELECT s.streamId, s.userId, u.username FROM streams AS s JOIN user AS u ON s.userId = u.id LIMIT 0 , 30 "; $streams_set = mysql_query($query, $connection); confirm_query($streams_set); $streams_count = mysql_num_rows($streams_set); while ($row = mysql_fetch_array($streams_set)){ echo '<option value="' , $row['streamId'] , '"> ' , $row['username'] , ' (' , $row['streamId'] ,')' ,'</option> '; } ?> </select> <br/> <input type="submit" class="ui-state-default ui-corner-all" name="submitForm" id="submitForm" value="Play Stream from selected URL's!"/> </fieldset> </form>
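
    One way is to read the posted Streams[] values back and emit a selected attribute for every option whose value appears in that array; a sketch of just the option loop from the form above:

        <?php
        $selected = isset($_POST['Streams']) ? $_POST['Streams'] : array();

        while ($row = mysql_fetch_array($streams_set)) {
            // re-select any option that was posted on the previous submit
            $isSelected = in_array($row['streamId'], $selected) ? ' selected="selected"' : '';
            echo '<option value="' . $row['streamId'] . '"' . $isSelected . '>'
               . $row['username'] . ' (' . $row['streamId'] . ')'
               . '</option>';
        }
        ?>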

    Read the article

  • db4o problem with graph of objects

    - by Staszek28
    I am new to db4o. I have a big problem with persistence of a graph of objects; I am trying to migrate from an old persistence component to a new one, using db4o. Before I persisted all the objects, the graph looked like the screenshot below (note the Zrodlo.Metadane.abstrakt string field with the focused value) [view from the Eclipse debugger], and the code was: ObjectContainer db=Db4o.openFile(DB_FILE); try { db.store(encja); db.commit(); } finally{ db.close(); } After that, I tried to read it with this code: ObjectContainer db=Db4o.openFile((DB_FILE)); try{ Query q = db.query(); q.constrain(EncjaDanych.class); ObjectSet<Object> objectSet = q.execute(); logger.debug("objectSet.size" + objectSet.size()); EncjaDanych encja = (EncjaDanych) objectSet.get(0); logger.debug("ENCJA" + encja.toString()); return encja; }finally{ db.close(); } and in the result (picture below) the string field "abstrakt" is now null! When I look at it using ObjectManager (picture below), the abstrakt field has a non-null value there - the same value as in the first picture. Please help me :) It is my second day with db4o. Thanks in advance! I am attaching the structure of the persisted class: public class EncjaDanych{ Map mapaIdRepo = new HashMap(); public Map mapaNazwaRepo = new HashMap(); }
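
    Fields coming back null after a query is commonly an activation-depth issue in db4o: the default depth (5) may not reach members nested as deeply as Zrodlo.Metadane.abstrakt. One hedged fix is to activate the loaded object to a greater depth before using it; the depth of 10 below is an arbitrary example value:

        EncjaDanych encja = (EncjaDanych) objectSet.get(0);
        // explicitly activate nested objects so deeper fields are populated;
        // alternatively raise the global activation depth in the db4o configuration
        db.activate(encja, 10);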

    Read the article

  • Best way to create a SPARQL endpoint for a RDBMS (MySQL database)

    - by Ankur
    I am doing (well, want to do) some experiments with Linked Open Datasets, particularly those put out by governments. I have an RDBMS (more specifically MySQL). I designed it with semantic web ideas in mind, i.e. I have information stored as objects, predicates and classes which define objects. In turn, all objects are related to each other through statements of the form subject -- predicate -- object (where the subjects are from the objects table). I want to be able to query other RDF triple stores from my application and let other triple stores query my data. Is it possible to "set something up" so that this is possible? I have looked at Jena. Using Jena seems to mean I have to use it as the storage application rather than MySQL - the only problem with this is that I include a new concept called a category (which I don't think is part of the semantic web languages). I will use categories to help with displaying information (they don't have any other meaning), but using Jena seems to mean that I can't organise predicates under categories for more convenient viewing. I am using Java, so a Java API is preferred. It's also possible I misunderstood the purpose of Jena, and maybe it can be of use, but I am not sure how. I am sure four days from now this question will seem rather silly, but at the moment I am somewhat confused about how to proceed.

    Read the article

  • Copy recordset data into multiple sheets to avoid problem of maximum rows limit in Excel VBA

    - by Sam
    I am developing a reporting application in Excel/VBA 2003. The VBA code sends a search query to the database and gets the data back through a recordset, which is then copied to an Excel sheet. The retrieved data looks like this:
    ProductID | DateProcessed | State
    1 | 1/1/2010 | Picked Up
    1 | 1/1/2010 | Forward To Approver
    1 | 1/2/2010 | Approver Picked Up
    1 | 1/3/2010 | Approval Completed
    2 | 1/1/2010 | Picked Up
    3 | 1/2/2010 | Picked Up
    3 | 1/2/2010 | Forward To Approver
    The problem is that the data returned by the search query is so large that it exceeds the Excel row limit (65,536 rows in Excel 2003), so I want to split the data across two sheets. While splitting, I want to ensure that the data for any one product stays on a single sheet. For example, if the last record in the result set above were the 65,537th record, I would also want to move all records for product 3 onto the new sheet; sheet 1 would then contain the records for products 1 and 2 (65,534 rows in total) and sheet 2 the records for product 3 (2 rows). How can I achieve this in VBA? If it is not possible, is there an alternative solution? Thanks in advance!
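
    One sketch of the copy loop: write row by row, and when the current sheet gets close to the 65,536-row limit, start a new sheet only at the point where ProductID changes, so a product's rows never straddle two sheets. The recordset variable rs and the field names are assumptions taken from the sample data, and the 65,000 threshold is an arbitrary safety margin:

        Dim ws As Worksheet
        Dim rowNum As Long
        Dim prevProduct As Variant

        Set ws = ThisWorkbook.Worksheets.Add
        rowNum = 1

        Do While Not rs.EOF
            ' start a new sheet near the limit, but only on a product boundary
            If rowNum >= 65000 And rs.Fields("ProductID").Value <> prevProduct Then
                Set ws = ThisWorkbook.Worksheets.Add
                rowNum = 1
            End If

            ws.Cells(rowNum, 1).Value = rs.Fields("ProductID").Value
            ws.Cells(rowNum, 2).Value = rs.Fields("DateProcessed").Value
            ws.Cells(rowNum, 3).Value = rs.Fields("State").Value

            prevProduct = rs.Fields("ProductID").Value
            rowNum = rowNum + 1
            rs.MoveNext
        Loop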

    Read the article

  • custom rss feed suddenly does not work anymore

    - by krike
    I just don't understand what's happening; I haven't changed anything on the site for a few months, but now the RSS feed suddenly doesn't work anymore. I created a PHP file with the following code: header('Content-type: text/xml'); include("config/config.inc.php"); $result = mysqli_query($link, "SELECT * FROM tutorials ORDER BY tutorial_id DESC LIMIT 50"); ?> <rss version="2.0"> <channel> <title>CMS tutorial site</title> <description>Bringing you the best CMS tutorials from the web</description> <link>http://cmstutorials.org</link> <?php while($row = mysqli_fetch_object($result)) { $user = mysqli_fetch_object(mysqli_query($link, "SELECT * FROM user_extra WHERE userid=".$row->user_id."")); ?> <item> <title><?php echo $row->title; ?></title> <author><?php echo $user->username; ?></author> <description><?php echo $row->description; ?></description> <pubDate><?php echo $row->date; ?></pubDate> <link>http://cmstutorials.org/view_tutorial.php?tutorial_id=<?php echo $row->tutorial_id; ?></link> </item> <?php } ?> </channel> </rss> I checked the query by executing it in phpMyAdmin and it works without any error. When I delete the Content-type header and the rss tag, it prints out each line from the query, but the feed itself won't display anything. This is the link to the feed: http://cmstutorials.org/rss (or http://cmstutorials.org/rss.php).
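
    A feed that shows nothing while the query itself works is often just invalid XML: a stray character or BOM emitted before the declaration, or an unescaped & or HTML markup inside a title or description (one new tutorial row with such a character would explain the sudden breakage). A defensive sketch of the item loop that escapes or CDATA-wraps the dynamic fields; the RFC 822 date formatting is an assumption about what the date column holds:

        <?php
        while ($row = mysqli_fetch_object($result)) {
            echo '<item>';
            // escape dynamic values so a stray & or <b> tag cannot break the XML
            echo '<title>' . htmlspecialchars($row->title, ENT_QUOTES, 'UTF-8') . '</title>';
            echo '<description><![CDATA[' . $row->description . ']]></description>';
            echo '<pubDate>' . date(DATE_RSS, strtotime($row->date)) . '</pubDate>';
            echo '<link>http://cmstutorials.org/view_tutorial.php?tutorial_id=' . (int)$row->tutorial_id . '</link>';
            echo '</item>';
        }
        ?>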

    Read the article

  • Postfix auto create Maildir

    - by Eugene
    I've been beating my head against a wall for a while now on this one. Basically, here is the rundown: Our MX record points to a frontend SMTP server, which contains aliases for actually routing the mail. No alias, no access to the backend storage server, which is what our clients connect to. I'm upgrading the backend email server. Currently, a user is created for every email user on the server, which creates the mailbox. On the new server, everything authenticates through PAM to an LDAP server (all of which is working properly). My goal is to get Postfix to create the Maildir directory for the user automatically. This works fine when I have the /home directory with 777 permissions, but for obvious reasons, this should be avoided. I would like to do this with 775 permissions on /home with a group owner of whatever user Postfix is running as, but I can't seem to figure out what user to use. With the 777 permissions, the /home/$user/Maildir directory is created on message delivery. Does anybody know how I can do this without 777 permissions? The system I am working on is a 64-bit Debian Lenny 5.07 install. Any advice would be appreciated.

    Read the article

  • When should we use Views, Temporary Tables and Direct Queries ? What are the Performance issues in a

    - by Shantanu Gupta
    I want to know the relative performance of using views, temp tables and direct queries in a stored procedure. I have a table that gets created every time a trigger fires. I know this trigger will fire very rarely, and only once at setup time. I then have to use that trigger-created table in many places for fetching data, and I can confirm that nobody makes any changes to it, i.e. it is a read-only table. I have to join this table's data with multiple other tables and fetch results for further queries, say select * from triggertable. Using a temp table: select ... into #tx from triggertable join t2 join t3 and so on select a,b, c from #tx --do something select d,e,f from #tx ---do something --and so on --around 6-7 queries in a row in a stored procedure. Using a view: create view viewname ( select ... from triggertable join t2 join t3 and so on ) select a,b, c from viewname --do something select d,e,f from viewname ---do something --and so on --around 6-7 queries in a row in a stored procedure. This view can be used in other places as well, so I would create it in the database rather than in the SP. Using direct queries: select a,b, c from select ... into #tx from triggertable join t2 join t3 join ... --do something select a,b, c from select ... into #tx from triggertable join t2 join t3 join ... --do something . . --and so on --around 6-7 queries in a row in a stored procedure. I can use a view, a temporary table, or direct queries in all the upcoming queries. Which would be best in this case?

    Read the article

  • MYSQL JOIN WHERE ISSUES - need some kind of if condition

    - by Breezer
    Hi. This will be hard to explain but I'll do my best. I have 4 tables, all with a specific column relating them to each other: one with users (agent_users), one with working hours (agent_pers), one with sold items (agent_stat) and one with projects (agent_pro). The user and project tables are irrelevant to the issue at hand, but I mention them so you can see why they appear in my query =) I use two pages to insert data into the working-hours and sold-items tables, and a third page to summarize everything for the current month. The query for that is as follows: SELECT *, SUM(sv_p_kom),SUM(sv_p_gick),SUM(sv_p_lunch) FROM (( agent_users LEFT JOIN agent_pers ON agent_users.sv_aid = agent_pers.sv_p_uid) LEFT JOIN agent_stat ON agent_pers.sv_p_uid = agent_stat.sv_s_uid) LEFT JOIN agent_pro ON agent_pers.sv_p_pid=agent_pro.p_id WHERE MONTH(agent_pers.sv_p_datum) =7 GROUP BY sv_aname The problem is that I don't want sold items from previous months to be included in the results. I know I could solve that by simply adding MONTH(agent_stat.sv_s_datum) =7 to the WHERE clause, but then, if no items were sold that month, no data at all shows up - not even the hours. Any help on how to solve this is greatly appreciated. If something is unclear, don't hesitate to ask and I'll do my best to answer - my English isn't the best out there :P Regards, breezer
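
    The usual trick is to move the month filter for the sales table out of WHERE and into the LEFT JOIN's ON clause; rows from other months then simply fail to join (and add nothing to the SUMs) instead of eliminating the agent's row entirely. A sketch using the names from the question; whether the hours filter should stay in WHERE or move into its own join depends on whether agents with no hours that month should still appear:

        SELECT *, SUM(sv_p_kom), SUM(sv_p_gick), SUM(sv_p_lunch)
        FROM agent_users
        LEFT JOIN agent_pers
               ON agent_users.sv_aid = agent_pers.sv_p_uid
        LEFT JOIN agent_stat
               ON agent_pers.sv_p_uid = agent_stat.sv_s_uid
              AND MONTH(agent_stat.sv_s_datum) = 7      -- sales filter lives in the join, not in WHERE
        LEFT JOIN agent_pro
               ON agent_pers.sv_p_pid = agent_pro.p_id
        WHERE MONTH(agent_pers.sv_p_datum) = 7
        GROUP BY sv_aname;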

    Read the article

  • C# Dataset Dynamically Add DataColumn

    - by Wesley
    I am trying to add an extra column to a dataset after a query has completed. I have the following database relationship: Employees at the top, with Groups and EmployeeGroups beneath it. Employees holds all the data for an individual; I'll call its unique key UserID. Groups holds all the groups an employee can be a part of, i.e. Super User, Admin, User, etc.; I'll call its unique key GroupID. EmployeeGroups holds the associations recording which groups each employee belongs to (UserID | GroupID). What I am trying to accomplish: after querying for all users, I want to loop through each user and record which groups that user belongs to, by adding a new string column named 'Groups' to the dataset and filling it from a second query that fetches that user's groups. Then, via data binding, I populate a ListView with all employees and their group associations. My code is as follows; position 5 is the new column I am trying to add to the dataset. string theQuery = "select UserID, FirstName, LastName, EmployeeID, Active from Employees"; DataSet theEmployeeSet = itsDatabase.runQuery(theQuery); DataColumn theCol = new DataColumn("Groups", typeof(string)); theEmployeeSet.Tables[0].Columns.Add(theCol); foreach (DataRow theRow in theEmployeeSet.Tables[0].Rows) { theRow.ItemArray[5] = "1234"; } At the moment the code creates the new column, but when I assign data to that column nothing is assigned. What am I missing? If there is any further explanation or information I can provide, please let me know. Thank you all.
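
    DataRow.ItemArray returns a copy of the row's values, so assigning into that array never touches the row itself; writing through the row's indexer (by column name or position) does, as in this sketch of the loop:

        foreach (DataRow theRow in theEmployeeSet.Tables[0].Rows)
        {
            // ItemArray is a snapshot; use the row's indexer to actually store the value
            theRow["Groups"] = "1234";

            // or, by ordinal:
            // theRow[5] = "1234";
        }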

    Read the article

  • Not loading associations without proxies in NHibernate

    - by Alice
    I don't like the idea of proxy and lazy loading. I don't need that. I want pure POCO. And I want to control loading associations explicitly when I need. Here is entity public class Post { public long Id { get; set; } public long OwnerId { get; set; } public string Content { get; set; } public User Owner { get; set; } } and mapping <class name="Post"> <id name="Id" /> <property name="OwnerId" /> <property name="Content" /> <many-to-one name="Owner" column="OwnerId" /> </class> However if I specify lazy="false" in the mapping, Owner is always eagerly fetched. I can't remove many-to-one mapping because that also disables explicit loading or a query like from x in session.Query<Comment>() where x.Owner.Title == "hello" select x; I specified lazy="true" and set use_proxy_validator property to false. But that also eager loads Owner. Is there any way to load only Post entity?

    Read the article

  • What could possibly be different between the table in a DataContext and an IQueryable<Table> when do

    - by Nate Bross
    I have a table where I need to do a case-insensitive search on a text field. If I run this query in LinqPad directly against my database, it works as expected: Table.Where(tbl => tbl.Title.Contains("StringWithAnyCase") In my application I've got a repository that exposes IQueryable objects, does some initial filtering, and looks like this: var dc = new MyDataContext(); public IQueryable<Table> GetAllTables() { var ret = dc.Tables.Where(t => t.IsActive == true); return ret; } In the controller (it's an MVC app) I use code like this in an attempt to mimic the LinqPad query: var rpo = new RepositoryOfTable(); var tables = rpo.GetAllTables(); // for some reason, this does a CASE SENSITIVE search which is NOT what I want. tables = tables.Where(tbl => tbl.Title.Contains("StringWithAnyCase"); return View(tables); The column is defined as an nvarchar(50) in SQL Server 2008. Any help or guidance is greatly appreciated!
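
    Whether Contains is case-sensitive here depends on where the predicate actually runs: translated to SQL it follows the column's collation, but if something has materialised the sequence first it becomes an ordinal (case-sensitive) string comparison in memory. A defensive sketch that behaves case-insensitively in either case is to normalise both sides:

        var term = "StringWithAnyCase".ToUpper();

        // ToUpper() translates to UPPER() when this still runs as SQL,
        // and gives the same case-insensitive result if it runs in memory.
        tables = tables.Where(tbl => tbl.Title.ToUpper().Contains(term));
        return View(tables);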

    Read the article

  • SqlServer slow on production environment

    - by Lieven Cardoen
    I have a weird problem in a production environment at a customer. I can't give any details on the infrastructure except that SQL Server runs on a virtual server and the data, log and filestream files are on another storage server (data and filestream together, and the log on a separate server). Now, there's a query that, when we run it, gives these durations (first we clear the cache): 300ms, 20ms, 15ms, 17ms,... The first run takes longer, but from then on it is cached. At the customer, on a SQL Server that is more powerful, these are the durations (I didn't have the rights to clear the cache; I will try that tomorrow): 2500ms, 2600ms, 2400ms, ... The query can be improved, that's true, but that's not the question here. How would you tackle this? I don't know where to go from here. The servers at this customer really are more powerful, but they are virtual servers (ours are not). What could be the cause... - Not enough memory? - Fragmentation? - Physical storage? I know it can be a lot of things, but maybe some of you have some pointers on how to proceed with this issue...

    Read the article
