Search Results

Search found 40567 results on 1623 pages for 'database performance'.

  • Recycle application pool, warm-up scripts: performance tuning on a SharePoint WCM site

    - by joel14141
    I was trying to tune a public-facing WCM site we run on SharePoint, and I have some doubts. By default, application pools are set to recycle themselves at 2 a.m., and because of that we need warm-up scripts. As I researched this topic I found mixed reactions: some MVPs say it is not advisable to recycle the application pool daily, and some say otherwise, so I am confused. If I don't recycle the application pool, then I don't have to use warm-up scripts. But my site is public-facing and serves users around the globe, so is it advisable to recycle daily? It will affect the performance of my site; even if I run a warm-up script afterwards, I don't think the site would perform as well as it should. Any advice on that? (A minimal warm-up sketch follows below.)

    Read the article
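
    For what it's worth, a warm-up script just requests the key pages right after the recycle, so the first real visitor doesn't pay the recompilation cost. SharePoint warm-ups are typically PowerShell or .NET console apps; the sketch below uses PHP and curl purely for consistency with the other examples on this page, and the URLs are placeholders for your own site.

      <?php
      // Minimal warm-up sketch: hit each key page once so the freshly
      // recycled application pool recompiles and caches them before
      // real visitors arrive. Run from a scheduled task shortly after
      // the 2 a.m. recycle. Assumes the site allows anonymous access.
      $urls = array(
          'http://www.example.com/',
          'http://www.example.com/Pages/Products.aspx',
          'http://www.example.com/Pages/Contact.aspx',
      );

      foreach ($urls as $url) {
          $ch = curl_init($url);
          curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // we only need the hit, not the body
          curl_setopt($ch, CURLOPT_TIMEOUT, 120);         // first hit after a recycle can be slow
          curl_exec($ch);
          echo $url . ' -> HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
          curl_close($ch);
      }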

  • JFFS2 poor mount performance

    - by Marcin Polkowski
    I run multiple ARM boards with Debian Linux installed. Each board is equipped with 512 MB of NAND flash. I've observed that after roughly three months of continuous operation, boot time increases significantly: it now takes over 3 minutes to mount the filesystem (JFFS2). The system was using about 35% of the available storage, so I removed unnecessary files (getting down to ~18%), but this didn't change anything. Then I realized that my software produces directories that are left empty, so I removed ~500 empty, unnecessary dirs. This didn't help either. After the system starts, I see the JFFS2 garbage collector (jffs2_gcd_mtd4) running and occupying over 90% of the CPU. Now my question: is there a way to "optimize" a JFFS2 filesystem for better performance, i.e. faster booting (my system has a limited time budget to boot up)? It would be great if this optimization could be done remotely, as I have no physical access to the boards.

    Read the article

  • PHP efficiency question: database call vs. file write vs. calling a C++ executable

    - by JP19
    Hi, what I wish to achieve is to log information about each and every visit to every page of my website (IP address, browser, referring page, etc.). Now, this is easy to do. What I am interested in is doing it in a way that causes minimum overhead (runtime) in the PHP scripts. Which approach is best, efficiency-wise?

    1) Log all information to a database table
    2) Write to a file (from PHP directly)
    3) Call a C++ executable that writes this info to a file in parallel, so the script can continue execution without waiting for the write to occur (is this even possible?)

    I may be optimizing unnecessarily/prematurely, but still, any thoughts or ideas would be appreciated. (I think the efficiency of file writes/logging can really be a concern if I have, say, 100 visits per minute...) A file-append sketch follows below. Thanks & regards, JP

    Read the article
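
    For what it's worth, a minimal sketch of option 2: append one line per request under an exclusive lock. file_put_contents() with FILE_APPEND is typically far cheaper than a database round-trip at this volume; the log path is a placeholder.

      <?php
      // Option 2 sketch: append one log line per page view.
      // FILE_APPEND adds to the end of the file; LOCK_EX prevents
      // interleaved lines when requests arrive concurrently.
      $line = sprintf(
          "%s\t%s\t%s\t%s\n",
          date('Y-m-d H:i:s'),
          $_SERVER['REMOTE_ADDR'],
          isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-',
          isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-'
      );

      // Placeholder path; the web server user must be able to write to it.
      file_put_contents('/var/log/sitevisits.log', $line, FILE_APPEND | LOCK_EX);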

  • Database preference for a network-based C# Windows application [on hold]

    - by Sinoop Joy
    I'm planning to develop a C# Windows-based application for an academy. The academy will have different instances of the application running on different machines, and the database should have shared access: all the application instances can update, delete, or insert. I've not done any network-based application before. Can anybody give me a useful link on where to start? And which database would give maximum performance, with all the required features I mentioned, for this scenario?

    Read the article

  • xinetd vs. iptables for port-forwarding performance

    - by jamie.mccrindle
    I have a requirement to run a Java-based web server on port 80. The options are:

    - a web proxy (Apache, nginx, etc.)
    - xinetd
    - iptables
    - setuid

    The baseline would be running the app using setuid, but I'd prefer not to for security reasons. Apache is too slow, and nginx doesn't support keep-alives, so new connections are made for every proxied request. xinetd is easy to set up but creates a new process for every request, which I've seen cause problems in a high-performance environment. The last option is port forwarding with iptables, but I have no experience of how fast it is. Of course, the ideal solution would be to do this on a dedicated hardware firewall / load balancer, but that's not an option at present.

    Read the article

  • ASA Slow IPSec Performance

    - by Brent
    I have an IPSec link between two sites over ASA 5520s running 8.4(3), and I am getting extremely poor performance when traffic passes over the VPN. CPU on the device is at 13%, memory at 408 MB, and there are 2 active VPN sessions, so the load on the device is particularly low. A Wireshark capture of a file transfer between the two hosts over the VPN shows a large number of header checksum failures, which is alarming, but I am not sure what to check now. iperf shows around 4-5 Mbit/sec with differing TCP window sizes. Show run on the ASA: http://pastebin.com/uKM4Jh76. Show crypto accelerator stats: http://pastebin.com/xQahnqK3.

    Read the article

  • Storing a hierarchical template in a database

    - by pduersteler
    If this title is ambiguous, feel free to change it; I don't know how to put this into a one-liner. Example: let's assume you have an HTML template which contains some custom tags, like <text_field />. We now create a page based on a template containing more of those custom tags. When a user wants to edit the page, he sees a text field; he can input things and save them. This looks fairly easy to set up: you have something like a template_positions table which stores the content of those fields. Case: I now have a bit of a blockade keeping things as simple as possible. Assume you have the same tag given in the example above, and additionally <layout> and <repeat> tags. Here's an example of how they should be used:

      <repeat>
        <layout name="image-left">
          <image />
          <text_field />
        </layout>
        <layout name="image-right">
          <text_field />
          <image />
        </layout>
      </repeat>

    We now have a block which can be repeated, obviously. This means: when creating/editing a page containing such a template block, I can choose between the layouts image-left and image-right, which then get inserted as content elements (where the content for <image /> and <text_field /> gets stored). And because this is inside a <repeat>, content elements from the given layouts can be inserted multiple times. How do you store this? Simply said, this could be stored with the same setup I wrote in the example above; I just need to add a parent_id or something similar to maintain a hierarchy. But I think I am missing something: at least the relation between an inserted content element and its origin/insertion point is missing. And what happens when I update the template file? Do I have to give every custom tag that acts as an editable part of a template an identifier matching an identifier in the template, to substitute them correctly? Or can you think of a cleaner solution? (A hypothetical schema sketch follows below.)

    Read the article
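
    One way to keep the relation the question says is missing is to give every editable tag in the template a stable name and store that name with each saved content element, along with a parent pointer for nesting and repetition. A hypothetical MySQL schema (all names invented), created here via PHP for consistency with the other examples on this page:

      <?php
      // Hypothetical schema: each saved content element remembers which
      // template tag it came from (tag_name, matching an identifier in
      // the template file) and which element it nests under (parent_id),
      // so repeated <layout> blocks become sibling rows under one parent.
      $link = mysql_connect('localhost', 'user', 'password'); // placeholders
      mysql_select_db('cms', $link);

      mysql_query("
          CREATE TABLE content_elements (
              id        INT UNSIGNED NOT NULL AUTO_INCREMENT,
              page_id   INT UNSIGNED NOT NULL,
              parent_id INT UNSIGNED NULL,      -- NULL for top-level elements
              tag_name  VARCHAR(64)  NOT NULL,  -- e.g. 'image-left' or 'text_field'
              position  INT          NOT NULL,  -- order among repeated siblings
              content   TEXT         NULL,      -- field text, image path, etc.
              PRIMARY KEY (id),
              KEY by_page (page_id, parent_id, position)
          )
      ") or die(mysql_error());

    When the template file changes, rows whose tag_name no longer matches any identifier in the template can be detected and either migrated or discarded, which is exactly the matching the question anticipates.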

  • Connect to a MySQL database and count the number of rows.

    - by Hugo
    Hi there! I need to connect to a MySQL database and then show the number of rows. This is what I've got so far:

      <?php
      include "connect.php";
      db_connect();
      $result = mysql_query("SELECT * FROM hacker");
      $num_rows = mysql_num_rows($result);
      echo $num_rows;
      ?>

    When I use that code I end up with this error:

      Warning: mysql_num_rows(): supplied argument is not a valid MySQL result resource in C:\Documents and Settings\username\Desktop\xammp\htdocs\news2\results.php on line 10

    Thanks in advance :D (A checked version is sketched below.)

    Read the article
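
    That warning means mysql_query() returned false instead of a result resource, usually because the query failed or because connect.php never selected a database; mysql_error() will say why. A sketch under those assumptions, which also lets MySQL do the counting with COUNT(*) instead of fetching every row:

      <?php
      include "connect.php";
      db_connect();

      // COUNT(*) makes the server count the rows rather than shipping
      // them all to PHP just to count them.
      $result = mysql_query("SELECT COUNT(*) FROM hacker");

      // mysql_query() returns false on failure; report why instead of
      // passing false on to a fetch function.
      if ($result === false) {
          die("Query failed: " . mysql_error());
      }

      $row = mysql_fetch_row($result);
      echo $row[0];
      ?>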

  • How do you design a database to allow fast multicolumn searching?

    - by Fletcher Moore
    I am creating a real-estate search from RETS data, but this is a general question: when you have a variety of columns that you would like the user to be able to filter their search results by, how do you optimize this? For example, http://www.charlestonrealestateguide.com/listings.php has 16 or so optional filters. Granted, he only has up to 11,000 entries (I have the same data), but I don't imagine the search is performed with just one giant WHERE ... AND ... AND ... clause. Or is this typically accomplished with one giant multicolumn index? Newegg, Amazon, and countless others also have cool and fast filtering systems for large amounts of data. How do they do it? And is there a database-optimization reason for the tendency to provide ranges instead of empty inputs, or is that merely for user convenience? (A dynamic-WHERE sketch follows below.)

    Read the article
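
    One common pattern (a sketch, not necessarily what the quoted sites do): build the WHERE clause from only the filters the user actually supplied, so short queries stay short and MySQL can choose among a few indexes you define on the most selective columns. Table and column names here are invented.

      <?php
      // Sketch: only filters the user filled in become conditions.
      // $filters would come from the validated search form ($_GET).
      $filters = array(
          'price_max' => 350000,
          'bedrooms'  => 3,
          // 'zip' was left empty by the user, so it isn't present
      );

      $conditions = array();
      foreach ($filters as $column => $value) {
          switch ($column) {
              case 'price_max':
                  $conditions[] = "price <= " . intval($value);
                  break;
              case 'bedrooms':
                  $conditions[] = "bedrooms >= " . intval($value);
                  break;
              case 'zip':
                  // escaping requires an open MySQL connection
                  $conditions[] = "zip = '" . mysql_real_escape_string($value) . "'";
                  break;
          }
      }

      $sql = "SELECT * FROM listings";
      if (count($conditions) > 0) {
          $sql .= " WHERE " . implode(" AND ", $conditions);
      }
      echo $sql; // SELECT * FROM listings WHERE price <= 350000 AND bedrooms >= 3

    Ranges fit this scheme well: a condition like price <= 350000 maps directly onto an index range scan, which is one database-side reason sites offer ranges rather than free-form inputs.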

  • Database table design vs. ease of use.

    - by Gastoni
    I have a table with 3 fields: color, fruit, date. I can pick 1 fruit and 1 color, but I can do this only once each day. Examples:

      red, apple, monday
      red, mango, monday
      blue, apple, monday
      blue, mango, monday
      red, apple, tuesday

    The two ways in which I could build the table are:

    1. Have color, fruit, and date form a composite primary key (PK). This makes it easy to insert data into the table, because all the validation needed is done by the database.

       PK color
       PK fruit
       PK date

    2. Have an id column as the PK, and then all the other fields. Many say that's the way it should be, because composite PKs are evil; for example, CakePHP does not support them.

       PK id
       color
       fruit
       date

    Both have advantages. Which would be the 'better' approach? (A sketch combining the two follows below.)

    Read the article
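
    A sketch of a middle road (MySQL, names invented): keep a surrogate id as the primary key so frameworks like CakePHP are happy, and let a UNIQUE constraint on the three columns still enforce "only once each day" in the database.

      <?php
      $link = mysql_connect('localhost', 'user', 'password'); // placeholders
      mysql_select_db('picks', $link);

      // Surrogate key for the tools, composite UNIQUE key for the rule:
      // inserting the same (color, fruit, pick_date) twice fails, just
      // as it would with the composite primary key of option 1.
      mysql_query("
          CREATE TABLE picks (
              id        INT UNSIGNED NOT NULL AUTO_INCREMENT,
              color     VARCHAR(20)  NOT NULL,
              fruit     VARCHAR(20)  NOT NULL,
              pick_date DATE         NOT NULL,
              PRIMARY KEY (id),
              UNIQUE KEY one_per_day (color, fruit, pick_date)
          )
      ") or die(mysql_error());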

  • Which JavaScript graphics library has the best performance?

    - by DNS
    I'm doing some research for a JavaScript project where the performance of drawing simple primitives (e.g., lines) is by far the top priority. The answers to this question provide a great list of JS graphics libraries. While I realize that the choice of browser has a greater impact than the library, I'd like to know whether there are any differences between them before choosing one. Has anyone done a performance comparison between any of these?

    Read the article

  • How do you demonstrate performance in paired-programming environments?

    - by NT3RP
    Performance reviews have come up recently at my work, and I was put in an interesting position. Our team does a lot of pair programming, which tends to average out the skill differences between team members (especially since we rotate pairs). Generally, when doing performance reviews, you look back at the work you've done, demonstrate what you've accomplished and how you've exceeded expectations, and try to negotiate a raise or other benefits. How do you demonstrate (or even measure) individual performance in an environment like this?

    Read the article

  • Server performance metrics report and practicality

    - by Anjesh
    I need to prepare a web server (Apache/PHP) performance report containing important metrics like CPU usage, disk I/O, and memory usage on a per-user basis. A couple of domains are hosted on the same server, and they run as separate users using FastCGI. The reason: sometimes one hosted application consumes a lot of CPU, making the server slow for the other applications (running as separate users). I am planning to develop scripts for this, as I can't seem to find any simple utilities for the purpose. The script would take snapshots of the per-user metrics at defined intervals, say every 15 minutes, and record them; any abnormalities would be reported via email. How practical is that? It would also be interesting to know what else should be recorded. (A snapshot sketch follows below.)

    Read the article
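
    A minimal CLI sketch of the snapshot idea, assuming a Linux host with the standard procps ps: it sums CPU and memory share per user and appends one line per user to a log, to be run from cron every 15 minutes. The path, threshold, and email address are placeholders.

      <?php
      // Per-user CPU/memory snapshot. Aggregates `ps` output by user;
      // %cpu and %mem are per-process shares, summed here per user.
      $lines = array();
      exec('ps -eo user:20,%cpu,%mem --no-headers', $lines);

      $byUser = array();
      foreach ($lines as $line) {
          $parts = preg_split('/\s+/', trim($line));
          if (count($parts) < 3) { continue; }
          list($user, $cpu, $mem) = $parts;
          if (!isset($byUser[$user])) {
              $byUser[$user] = array('cpu' => 0.0, 'mem' => 0.0);
          }
          $byUser[$user]['cpu'] += (float)$cpu;
          $byUser[$user]['mem'] += (float)$mem;
      }

      $now = date('Y-m-d H:i');
      foreach ($byUser as $user => $m) {
          $record = sprintf("%s\t%s\tcpu=%.1f%%\tmem=%.1f%%\n",
                            $now, $user, $m['cpu'], $m['mem']);
          file_put_contents('/var/log/user-metrics.log', $record,
                            FILE_APPEND | LOCK_EX);          // placeholder path

          if ($m['cpu'] > 80.0) {                            // placeholder threshold
              mail('admin@example.com', "High CPU for $user", trim($record));
          }
      }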

  • Data architecture for event log metrics?

    - by elliot42
    My service has a large ongoing number of user events, and we would like to do things like "count occurrences of event type T since date D". We are trying to make two basic decisions:

    1. What to store: every event, or only aggregates?
       (Event-log style) log every event and count them later, vs.
       (Time-series style) store a single aggregated "count of event E for date D" for every day

    2. Where to store the data?
       In a relational database (particularly MySQL)
       In a non-relational (NoSQL) database
       In flat log files (collected centrally over the network via syslog-ng)

    What is standard practice, and where can I read more about comparing the different types of systems? Additional details: the total event stream is large, potentially hundreds of thousands of entries per day, but our current need is only to count certain types of events within it. We don't necessarily need real-time access to the raw data or to the aggregation results. IMHO, "log all events to files, crawl them at a later time to filter and aggregate the stream" is a pretty standard Unix way, but my Rails-y compatriots seem to think that nothing is real unless it's in MySQL. (A time-series sketch follows below.)

    Read the article
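
    A sketch of the time-series option in MySQL, assuming a daily-aggregate table with a composite primary key (table and column names are hypothetical, and an open mysql connection is assumed): each event becomes one upsert, and "count of T since D" becomes a cheap range query.

      <?php
      // Hypothetical table:
      //   CREATE TABLE event_counts (
      //       event_type VARCHAR(64)  NOT NULL,
      //       day        DATE         NOT NULL,
      //       n          INT UNSIGNED NOT NULL DEFAULT 0,
      //       PRIMARY KEY (event_type, day)
      //   );

      // Bump today's counter: the primary key makes this an insert the
      // first time an event type is seen today, an increment afterwards.
      function count_event($type) {
          $type = mysql_real_escape_string($type);
          mysql_query("
              INSERT INTO event_counts (event_type, day, n)
              VALUES ('$type', CURDATE(), 1)
              ON DUPLICATE KEY UPDATE n = n + 1
          ") or die(mysql_error());
      }

      // "Count occurrences of event type T since date D":
      function count_since($type, $since) {
          $type  = mysql_real_escape_string($type);
          $since = mysql_real_escape_string($since);
          $result = mysql_query("
              SELECT COALESCE(SUM(n), 0) FROM event_counts
              WHERE event_type = '$type' AND day >= '$since'
          ") or die(mysql_error());
          $row = mysql_fetch_row($result);
          return (int)$row[0];
      }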

  • Best database setup for one-click games

    - by ewizard
    I am building a one-click game website/mobile app, and I am debating between MySQL and MongoDB for the backend. I have been exploring it with a NodeJS/Express/Angular/Passport/MongoDB stack, and I have also implemented Socket.io. I have gotten to the point where I am sending data from the Flash game to the server (NodeJS). The only data that needs to be sent is basic user information, each player's score at the end of each game, and some x,y positions from each player's game (for anti-cheating). It seems like MySQL would work fine, but as I am already using MongoDB, are there any major drawbacks to continuing with MongoDB on this project?

    Read the article

  • Recovering SQL Server 2008 Database From Error 2008

    MS SQL Server 2008 is the latest version of SQL Server. It has been designed with the SQL Server Always On technologies that minimize downtime and maintain appropriate levels of application availa... [Author: Mark Willium - Computers and Internet - May 13, 2010]

    Read the article

  • Programming language for simple program?

    - by jamherst
    I am wondering which programming languages people see as fit for a program idea I have. I am looking to create a fairly simple program whose main functions are adding to, managing, and searching through a database of people, all through a polished GUI. It is for use in the business world, so I think Windows would be the priority, but Mac and Linux support wouldn't be bad. Also, I would eventually like to add the ability for an instance of the program on one computer to interact with other instances on the same network, mainly by sharing a database. Most of my experience is in Java, but I don't particularly like the appearance of Java GUIs, so I'm looking for an alternative. I've noticed that a lot of people suggest C++ or C# in similar posts; if that is your suggestion, what are some of the advantages and disadvantages of one or both? Thanks for any help in advance.

    Read the article

  • Cost of a web server that hosts and delivers text only

    - by slandau
    We are developing an application that needs a web server to interact with the two (or more) entities involved. They will not ever see anything on the web, but the server is required for the transfer of data between them; it's sort of a holding point. Now, the only thing the server is going to hold is textual data, and the two entities do the work with that data. I was wondering what the cost of this type of server would be. Since it would be JUST a database with no front end, would it make sense to use a service from Amazon or Google that just holds data for me to access, instead of buying a server and building my own database? The amount of data can grow very large, but it's only text, and for the most part all data over a day old is deleted daily. Thanks!

    Read the article

  • How to copy database files from a network server to a client PC in C#.NET?

    - by zoya
    I'm using code to copy files from the database folder of a server PC. I access the server PC through its IP address, but it gives an error and doesn't copy the files into the folder on my PC (the client PC). This is the code I'm using; can you tell me where I'm wrong? The file path comes from a ListView in my WinForms app.

      // Build a UNC path of the form \\10.0.4.123\records\file.wav.
      // The original format string, "\\{0}'{1}'", produced an invalid
      // path; note that recordpath must be relative to a folder that is
      // actually shared on the server, not a local drive letter.
      public string RecordingFileCopy(string recordpath, string ipadd)
      {
          return String.Format(@"\\{0}\{1}", ipadd, recordpath);
      }

    And on the button click event:

      try
      {
          foreach (ListViewItem item in listView1.Items)
          {
              string sourceFile = item.SubItems[5].Text;
              // Copy from the UNC path; the original code discarded the
              // return value and copied the local path instead.
              string remotePath = RecordingFileCopy(sourceFile, "10.0.4.123");
              File.Copy(remotePath, Path.Combine(@"E:\name\MyDir", Path.GetFileName(sourceFile)));
          }
      }
      catch
      {
          MessageBox.Show("Files are not copied to folder", _strMsg,
                          MessageBoxButtons.OK, MessageBoxIcon.Error);
      }

    Read the article

  • Wireless performance on Ubuntu 9.10

    - by Brian
    Is there something I should change in my networking configuration in Ubuntu to improve the performance of my wireless connection? I'm on a netbook dual-booting Windows 7 and Ubuntu 9.10, and I pick up a much stronger Wi-Fi signal in Windows than in Ubuntu. When I boot Ubuntu, it connects to the network with a strong signal, then loses the signal very quickly; after it dies, I can't reconnect. I've tested this on a couple of different networks with the same outcome.

    Read the article

  • Oracle Exadata (Japanese-language post)

    - by takashi.hitomi
    [The original Japanese text of this June 2010 post was lost to character-encoding damage; only fragments survive. It promoted Oracle Exadata, with references to Oracle's Smart Grid strategy, Oracle Database, and the "Oracle Cloud Computing Summit ~ Database & Exadata Day ~" event.]

    Read the article

  • Receive anonymous users' input by web upload form or email. Any online service for that?

    - by sja
    Are you aware of any online service or platform allowing users, not previously registered, to upload picture+comment pairs to a database? It would be a collaborative database of picture+comment pairs. I'm not going with a wiki, Google Group, Picasa, or the like, because I'd like users to have as little as possible to do in order to participate; for example: take a picture with a phone and email it to an email address, or go to a web page with an upload form, type in a description, hit OK, and that's it. The goal is also for it to be as hassle-free to set up as possible. Yeah, I know it can't program itself to my requirements :) but I suspect there's a tool or tool combination covering a decent part of my needs. Thanks for any info/advice! SJA (NB: the final goal is a kind of crowd-sourced census of specific urban items. If you have comments about the potential for spam overload of my idea, other than "you're doomed", you're welcome!) A minimal upload-form sketch follows below.

    Read the article
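
    If no hosted service fits, a self-hosted version is small. A sketch of the web-form half (picture plus comment), assuming PHP with file uploads enabled; the paths and the size limit are placeholders.

      <?php
      // Minimal anonymous picture+comment intake. Expects a form like:
      //   <form method="post" enctype="multipart/form-data">
      //     <input type="file" name="picture">
      //     <input type="text" name="comment">
      //     <input type="submit" value="OK">
      //   </form>
      if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_FILES['picture'])) {
          $up = $_FILES['picture'];

          // Sanity checks: the upload succeeded and is reasonably small.
          if ($up['error'] === UPLOAD_ERR_OK && $up['size'] < 5 * 1024 * 1024) {
              // Never trust the client's filename; generate our own.
              $dest = '/var/uploads/' . uniqid('pic_', true) . '.jpg';
              move_uploaded_file($up['tmp_name'], $dest);

              // Record the pair; a flat index file keeps the sketch simple
              // (a real setup would more likely insert into a database).
              $comment = isset($_POST['comment']) ? $_POST['comment'] : '';
              $comment = str_replace(array("\r", "\n"), ' ', $comment);
              file_put_contents('/var/uploads/index.txt',
                                $dest . "\t" . $comment . "\n",
                                FILE_APPEND | LOCK_EX);
              echo "Thanks!";
          } else {
              echo "Upload failed.";
          }
      }

    Anonymous intake invites spam, as the NB anticipates; at minimum, rate-limit by IP or put a CAPTCHA in front of this.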
