Search Results

Search found 5021 results on 201 pages for 'limit'.


  • SharePoint SLK and T-SQL xp_cmdshell safety

    - by Mitchell Skurnik
    I am looking into the T-SQL command "xp_cmdshell" as a way to monitor changes to the SLK (SharePoint Learning Kit) database and then execute a batch or PowerShell script that triggers some events I need. (It is bad practice to modify SharePoint's database directly, so I will be using its API.) I have read on various blogs and on MSDN that there are security concerns with this approach; the sites suggest locking down security so that only a specific user role can execute the command. What other tips or suggestions would you recommend when using "xp_cmdshell"? Or should I go about this another way and create a script or console application that polls for changes? I am running Server 2008 with SQL Server 2008.
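    A minimal lock-down sketch for SQL Server 2008, assuming a dedicated low-privilege Windows account for the proxy (the account and role names here are placeholders, not SLK conventions):

        -- Enable xp_cmdshell (off by default in SQL Server 2008).
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'xp_cmdshell', 1;
        RECONFIGURE;

        -- Non-sysadmin callers run under this low-privilege proxy account
        -- instead of the SQL Server service account.
        EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\SlkMonitor', 'StrongPasswordHere';

        -- Grant execute rights only to a dedicated role in master.
        USE master;
        CREATE ROLE SlkMonitorRole;
        GRANT EXECUTE ON xp_cmdshell TO SlkMonitorRole;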


  • How to declare a variable that spans multiple lines

    - by Chris Wilson
    I'm attempting to initialise a string variable in C++, and the value is so long that it exceeds the 80-character-per-line limit I'm working to, so I'd like to split it across source lines, but I'm not sure how. I know that when writing to a stream, the output can be split across lines:

        cout << "This is a string"
             << "This is another string";

    Is there an equivalent for variable assignment, or do I have to declare multiple variables and concatenate them? Edit: I misspoke in the initial question. By 'next line' I mean the next line of the source file; when printed at execution, the value should still be on one line.
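    For reference, adjacent string literals are concatenated by the compiler, so the initialiser can span several source lines while the value stays one line; a minimal sketch:

        #include <iostream>
        #include <string>

        int main() {
            // The compiler merges adjacent literals into one string,
            // so only the source is split, not the output.
            std::string message = "This is a string "
                                  "that continues on the next source line "
                                  "but prints as a single line.";
            std::cout << message << '\n';
            return 0;
        }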


  • SQL Server 2008 EF 4 - limiting database records returned using permissions?

    - by Chuck
    In our database all tables link back to a single table, which has a bit column used to limit whether a record is displayed. Currently the records are filtered in the website code. Is there any way to set up permissions so that userA would never see any record in the database where that common bit column is set to true? We are using SQL Server 2008. Alternatively, we are also using Entity Framework 4.0 on .NET 4 (in C#), if you have any ideas how it might be accomplished there. Thanks for your feedback.
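    SQL Server 2008 has no built-in row-level security, so one common approach is a filtered view with permissions granted on the view only; a sketch with hypothetical table and column names:

        CREATE VIEW dbo.VisibleRecords AS
            SELECT * FROM dbo.Records WHERE IsHidden = 0;

        DENY SELECT ON dbo.Records TO userA;
        GRANT SELECT ON dbo.VisibleRecords TO userA;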


  • Python: skipping loop iterations a random number of times

    - by Brian Cox
    All, I am not very good at explaining, so I will let my comments do it!

        # this script calculates some of the times tables up to 24x24
        # and also misses some out
        # range of numbers to be calculated
        numbers = range(1, 25)
        for i in numbers:
            for w in numbers:
                print(str(i) + "X" + str(w) + "=" + str(i * w))
                # here i want to break randomly (skip some out), e.g. i could
                # be doing 12X1, 12X2 and then 12X5; i have no limit of skips

    Update: sorry if this is not clear; I want the inner loop to skip a random number of iterations.
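    A minimal sketch of one interpretation, randomly skipping about a third of the inner-loop iterations (the 0.3 probability is an arbitrary choice for illustration):

        import random

        numbers = range(1, 25)
        for i in numbers:
            for w in numbers:
                # Skip this entry with 30% probability; continue moves on
                # to the next w without printing.
                if random.random() < 0.3:
                    continue
                print(str(i) + "X" + str(w) + "=" + str(i * w))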


  • Math Mod Containing Numbers

    - by TheDarkIn1978
    I would like to write a simple line of code, without resorting to if statements, that evaluates whether a number is within a certain range. I can evaluate from 0 to max by using the modulus:

        30 % 90 = 30  // great

    However, if the test number is greater than the maximum, the modulus simply wraps around to 0, whereas I would like to clamp it to the maximum:

        94 % 90 = 4   // i would like the answer to be 90

    It becomes even more complicated, to me anyway, if I introduce a minimum for the range. For example, with minimum = 10 and maximum = 90, any number I evaluate should be either within range, or the minimum value if it's below range, or the maximum value if it's above range:

        -76 should be 10
        2 should be 10
        30 should be 30
        89 should be 89
        98 should be 90
        23553 should be 90

    Is it possible to evaluate this with one line of code without using if statements?
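    The question's language isn't stated; a sketch of the usual clamp idiom in Python, which needs no if statement and no modulus:

        minimum, maximum = 10, 90

        def clamp(n):
            # min() caps values above the range, max() lifts values below it.
            return max(minimum, min(maximum, n))

        assert clamp(-76) == 10
        assert clamp(2) == 10
        assert clamp(30) == 30
        assert clamp(23553) == 90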


  • Adding a Third Table to a Two-Table Join Query

    - by John
    Hello. The query below works just fine. It pulls fields from two MySQL tables, "comment" and "login", for rows where "username" in the table "login" equals the variable "$profile", and where "loginid" in the table "comment" equals the "loginid" pulled from "login". I would also like to pull data from a third table called "submission", which has the following fields:

        submissionid, loginid, title, url, displayurl, datesubmitted

    I would like to pull fields from rows in "submission" where "loginid" equals the "loginid" already being pulled from the other two tables, "login" and "comment". How can I do this? Thanks in advance, John. Query:

        $sqlStrc = "SELECT l.username, l.loginid, c.loginid, c.commentid,
                           c.submissionid, c.comment, c.datecommented
                    FROM comment AS c
                    INNER JOIN login AS l ON c.loginid = l.loginid
                    WHERE l.username = '$profile'
                    ORDER BY c.datecommented DESC
                    LIMIT 10";
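    A sketch of adding the third join as asked. Note that joining submission on loginid pairs every comment with every submission by that user, multiplying the rows; if the comments reference submissions, joining on c.submissionid = s.submissionid is likely what's wanted instead:

        SELECT l.username, l.loginid,
               c.commentid, c.comment, c.datecommented,
               s.submissionid, s.title, s.url, s.displayurl, s.datesubmitted
        FROM comment AS c
        INNER JOIN login AS l ON c.loginid = l.loginid
        INNER JOIN submission AS s ON s.loginid = l.loginid
        WHERE l.username = '$profile'
        ORDER BY c.datecommented DESC
        LIMIT 10;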


  • Code sample with buffer overflow (gets). Why does it not behave as expected?

    - by citronas
    This is an extract from a C program that should demonstrate a buffer overflow:

        void foo() {
            char arr[8];
            printf(" enter bla bla bla");
            gets(arr);
            printf(" you entered %s\n", arr);
        }

    The question was: "How many input characters can a user enter at most without creating a buffer overflow?" My initial answer was 8, because the char array is 8 bytes long. Although I was pretty certain my answer was correct, I tried a higher number of characters and found that the limit of characters I can enter before I get a segmentation fault is 11. (I'm running this on a VirtualBox Ubuntu.) So my question is: why is it possible to enter 11 characters into that 8-byte array?
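    For reference, the extra headroom usually comes from stack alignment padding between arr and the saved frame data, so the observed limit is compiler- and platform-specific. A bounded version with fgets, as a sketch:

        #include <stdio.h>

        void foo(void) {
            char arr[8];
            printf(" enter bla bla bla: ");
            // fgets reads at most sizeof arr - 1 characters plus the
            // terminating null, so it cannot write past the buffer the
            // way gets does.
            if (fgets(arr, sizeof arr, stdin) != NULL) {
                printf(" you entered %s\n", arr);
            }
        }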


  • MySQL command-line tool: How to find out number of rows affected by a DELETE?

    - by ambivalence
    I'm trying to run a script that deletes a bunch of rows from a MySQL (InnoDB) table in batches, by executing the following in a loop:

        mysql --user=MyUser --password=MyPassword MyDatabase < SQL_FILE

    where SQL_FILE contains a DELETE FROM ... LIMIT X command. I need to keep running this loop until there are no more matching rows. But unlike running in the mysql shell, the above command does not return the number of rows affected. I've tried -v and -t but neither works. How can I find out how many rows the batch script affected? Thanks!
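    A sketch of one approach: append SELECT ROW_COUNT() to the batch, since it reports the rows affected by the preceding statement, and loop until it returns 0 (the table and WHERE clause below are hypothetical; --skip-column-names leaves just the number on stdout):

        #!/bin/sh
        while :; do
            n=$(mysql --user=MyUser --password=MyPassword --skip-column-names \
                MyDatabase -e "DELETE FROM big_table WHERE archived = 1 LIMIT 1000;
                               SELECT ROW_COUNT();")
            echo "deleted $n rows"
            [ "$n" -eq 0 ] && break
        done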


  • How to ORDER BY non-column field?

    - by Phil Bolduc
    I am trying to create an Entity SQL query that is a union of two sub-queries:

        (SELECT VALUE DISTINCT ROW(e.ColumnA, e.ColumnB, 1 AS Rank)
         FROM Context.Entity AS e WHERE ...)
        UNION ALL
        (SELECT VALUE DISTINCT ROW(e.ColumnA, e.ColumnB, 2 AS Rank)
         FROM Context.Entity AS e WHERE ...)
        ORDER BY ??? LIMIT 50

    I have tried ORDER BY Rank and ORDER BY e.Rank, but I keep getting:

        System.Data.EntitySqlException: The query syntax is not valid. Near keyword 'ORDER'

    I do not think it is a problem with the Rank column; I think it is how I am trying to apply an ORDER BY to two Entity SQL statements joined by UNION ALL. Could someone suggest: (1) how to apply an ORDER BY to this kind of UNION/UNION ALL statement, and (2) how to order by the non-entity column expression? Thanks.
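    A sketch of the usual workaround: wrap the union in a sub-query so the ORDER BY applies to the combined result (the WHERE predicates are elided as in the question):

        SELECT VALUE t FROM (
            (SELECT VALUE DISTINCT ROW(e.ColumnA, e.ColumnB, 1 AS Rank)
             FROM Context.Entity AS e WHERE ...)
            UNION ALL
            (SELECT VALUE DISTINCT ROW(e.ColumnA, e.ColumnB, 2 AS Rank)
             FROM Context.Entity AS e WHERE ...)
        ) AS t
        ORDER BY t.Rank
        LIMIT 50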


  • How to generate a .json file with PHP?

    - by Srinivas Tamada
        CREATE TABLE Posts (
            id INT PRIMARY KEY AUTO_INCREMENT,
            title VARCHAR(200),
            url VARCHAR(200)
        )

    json.php code:

        <?php
        $sql = mysql_query("select * from Posts limit 20");
        echo '{"posts": [';
        while ($row = mysql_fetch_array($sql)) {
            $title = $row['title'];
            $url = $row['url'];
            echo '{"title":"' . $title . '","url":"' . $url . '"},';
        }
        echo ']}';
        ?>

    I have to generate a results.json file.
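    A sketch using json_encode (PHP 5.2+) to write the file; it also avoids the trailing comma the hand-built string leaves after the last row, which makes the output invalid JSON:

        <?php
        $sql = mysql_query("SELECT title, url FROM Posts LIMIT 20");
        $posts = array();
        while ($row = mysql_fetch_assoc($sql)) {
            $posts[] = array('title' => $row['title'], 'url' => $row['url']);
        }
        // json_encode handles quoting and escaping correctly.
        file_put_contents('results.json', json_encode(array('posts' => $posts)));
        ?>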


  • Returning more than 1000 rows in classic asp adodb.recordset

    - by peg_leg
    My code in classic ASP, doing a MSSQL database query:

        rs.PageSize = 1000   ' this should enable paging
        rs.MaxRecords = 0    ' 0 = unlimited MaxRecords
        Response.Write "hello world 1<br>"
        rs.Open strSql, conn
        Response.Write "hello world 2<br>"

    My output is good when there are fewer than 1000 rows returned. With more than 1000 rows I don't get the "hello world 2". I thought that setting PageSize sets up paging and thus allows all rows to be returned regardless of how many there are, and that without setting PageSize, paging is not enabled and the limit is 1000 rows. However my page is acting as if PageSize is not working at all. Please advise.
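    A debugging sketch to surface whatever ADO is actually failing on instead of a silent stop (generic classic ASP/VBScript error handling, not specific to paging):

        On Error Resume Next
        rs.Open strSql, conn
        If Err.Number <> 0 Then
            Response.Write "Open failed: " & Err.Number & " - " & Err.Description & "<br>"
        End If
        Dim oErr
        For Each oErr In conn.Errors
            Response.Write "ADO: " & oErr.Number & " - " & oErr.Description & "<br>"
        Next
        On Error GoTo 0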


  • Walking through an SQLite Table

    - by galford13x
    I would like to implement or use functionality that allows stepping through a table in SQLite. If I have a table Products with 100k rows, I would like to retrieve perhaps 10k rows at a time, similar to how a webpage would list data with < Previous .. Next > links to walk through it. Are there SELECT statements that can make this simple? I have tried using ROWID in conjunction with LIMIT, which seems OK if I'm not ordering the data:

        -- This seems to work if not ordering.
        SELECT * FROM Products WHERE ROWID BETWEEN x AND y;
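    Two sketches that also survive an ORDER BY: plain LIMIT/OFFSET, and keyset pagination, which stays fast on later pages because it seeks instead of skipping rows (the ProductID column is hypothetical):

        -- Page 3 of 10k-row pages, in a stable order:
        SELECT * FROM Products ORDER BY ProductID LIMIT 10000 OFFSET 20000;

        -- Keyset pagination: remember the last ProductID shown on the
        -- previous page and continue from there.
        SELECT * FROM Products
        WHERE ProductID > :last_seen_id
        ORDER BY ProductID
        LIMIT 10000;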


  • Problem reducing a page's query count

    - by Mac Taylor
    Hey guys. I have a tag table in my PHP/MySQL project that looks like this:

        Table name: bt_tags
        Table fields: tid, tag

    And every story row has a field named tags:

        Table name: stories
        Table field: tags

    The tag ids are stored in that field separated by spaces, e.g. "1 5 6". Now the problem: when using a while loop to fetch all the rows in the story table, the page uses one query to show each story's details, but to show the tag names I have to query another table, since only the ids are stored with the story. I currently use a for loop inside the while loop to fetch the tag names, but I'm sure there is a better way that reduces the number of queries per page. How can I improve this script and show the tag names without the inner for loop?

        $result = $db->sql_query("SELECT * FROM ".STORY_TABLE." ");
        while ($row = $db->sql_fetchrow($result)) {
            // fetching other $vars ...
            $tags_id = explode(" ", $row['tags']);
            $c = count($tags_id);
            for ($i = 1; $i < $c - 1; $i++) {
                list($tag_name, $slug) = $db->sql_fetchrow($db->sql_query(
                    'SELECT `tag`,`slug` FROM `bt_tags` WHERE `tid` = "' . $tags_id[$i] . '" LIMIT 1'
                ));
                $show_tags .= $tag_name . ',';
            }
        }
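    A sketch of collapsing the per-tag lookups into a single IN() query, keeping the question's $db wrapper (whose exact API is assumed from the snippet):

        // First pass: collect every tag id used by the fetched stories.
        $result = $db->sql_query("SELECT * FROM " . STORY_TABLE);
        $stories = array();
        $all_ids = array();
        while ($row = $db->sql_fetchrow($result)) {
            $ids = array_filter(explode(' ', $row['tags']));
            $row['tag_ids'] = $ids;
            $stories[] = $row;
            $all_ids = array_merge($all_ids, $ids);
        }

        // One query fetches every needed tag name.
        $tags = array();
        $all_ids = array_unique(array_map('intval', $all_ids));
        if ($all_ids) {
            $tag_result = $db->sql_query(
                'SELECT tid, tag, slug FROM bt_tags WHERE tid IN (' . implode(',', $all_ids) . ')'
            );
            while ($t = $db->sql_fetchrow($tag_result)) {
                $tags[$t['tid']] = $t;
            }
        }
        // Render: look up $tags[$id]['tag'] per story with no further queries.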


  • Qt4: showing elements with pagination

    - by matiit
    I am going to write an application that uses Qt4 (with C++ or Python; it isn't important at the moment). One piece of functionality is "show all items in the database". An item has a title, author, description, and photo (constant size), and there could be very many items, say 400. There won't be enough space to show them all at once: one row takes 200px, so at most 4 fit at a time. How do I paginate them? I have no idea. I can use LIMIT and OFFSET in SQL queries, but how do I tell the window "that's the 5th page"? Any solutions?
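    A minimal sketch of the page-to-offset arithmetic, using Python's sqlite3 (the table and column names are hypothetical):

        import sqlite3

        PAGE_SIZE = 4  # four 200px rows fit the window

        def fetch_page(conn, page):
            # Page numbers start at 1; the offset skips the earlier pages.
            offset = (page - 1) * PAGE_SIZE
            return conn.execute(
                "SELECT title, author, description, photo FROM items "
                "ORDER BY title LIMIT ? OFFSET ?",
                (PAGE_SIZE, offset)).fetchall()

        # "That's the 5th page" is then just fetch_page(conn, 5).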


  • Which type of Rails model association should I use in this situation?

    - by jstayton
    I have two models/tables in my Rails application: discussions and comments. Each discussion has_many comments, and each comment belongs_to a discussion. My discussions table also includes first_comment_id and last_comment_id columns for convenience and speed. I want to be able to call discussion.last_comment for the last comment model, but the following (in my discussion model) isn't working:

        has_one :first_comment, :class_name => "Comment"
        has_one :last_comment, :class_name => "Comment"

    When I call discussion.last_comment, the following SQL is run:

        SELECT * FROM `comments` WHERE (`comments`.discussion_id = 1) LIMIT 1

    It's joining the discussions.id column against comments.discussion_id, when I want it to join discussions.last_comment_id against comments.id. Am I using the wrong type of association here? Thanks for your help!
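    Since the foreign keys live on the discussions table, these want to be belongs_to associations rather than has_one; a sketch in the question's Rails 2/3-style syntax:

        class Discussion < ActiveRecord::Base
          has_many :comments
          # belongs_to looks up comments.id via the foreign key column on
          # this table, which is the join direction the question wants.
          belongs_to :first_comment, :class_name => "Comment", :foreign_key => "first_comment_id"
          belongs_to :last_comment,  :class_name => "Comment", :foreign_key => "last_comment_id"
        end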


  • How to find/display the Upload File limits on IIS with ASP.NET?

    - by NVRAM
    I have a web service on which end users will upload ZIP archives that can be very large (one test file is over 200MB). I'd like to handle oversized files proactively and handle size-limited upload failures gracefully. The web app will be deployed on customers' machines, so I cannot easily ensure that the configuration matches any fixed size. I've documented how they can use the appcmd command to raise the requestLimits.maxAllowedContentLength value beyond the 30MB default. But I'd like to handle it in the web app; I'm hoping for two things: (1) to show the current limit on the page where they initiate the file upload, along the lines of "Each file upload is limited to 15MB. If your archive is larger, (etc., etc., etc.)"; and (2) to give a meaningful error when that size is exceeded. Currently, it takes a long time for the data to be sent, and then I see a misleading 404 page. Any thoughts?
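    A sketch of reading the classic ASP.NET limit at runtime so the page can display it. Note maxRequestLength lives in system.web/httpRuntime and is expressed in kilobytes; the IIS7 requestLimits.maxAllowedContentLength is a separate system.webServer setting, in bytes:

        using System.Web.Configuration;

        public static class UploadLimits
        {
            // Returns the ASP.NET request-size ceiling in kilobytes.
            public static int MaxRequestKilobytes()
            {
                var runtime = (HttpRuntimeSection)
                    WebConfigurationManager.GetSection("system.web/httpRuntime");
                return runtime.MaxRequestLength;
            }
        }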


  • Fixing too long comment lines in Vim

    - by Tomek Kaftal
    I'm looking for a convenient way to fix comments whose line lengths exceed a certain number of characters in Vim. I'm fine with doing this manually for code, especially since it's not that frequent, and refactoring long lines is often language-dependent or even code-style-dependent; but with comments it is pure drudgery. What happens is I often spot some issue in a comment, tweak one or two words, and the line spills out of the, say, 80-character limit. I move the last word to the next line, then the next line spills, and so on. Does anyone know a way to do this automatically in Vim?
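    A sketch of the built-in answer: set 'textwidth' and reflow with the gq operator, which keeps the comment leader on wrapped lines via 'formatoptions' and 'comments':

        " In .vimrc or for the buffer:
        set textwidth=80

        " Then, with the cursor inside the comment:
        "   gqip  - reflow the current paragraph
        "   gqq   - reflow the current line
        " Visual mode also works: select the comment lines and press gq.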


  • Best place to store large amounts of session data

    - by audiopleb
    I'm building an application that needs to store and re-use large amounts of data per session. For example, the user selects a large list of list items (say 2,000 or significantly more), each with a numeric key, saves that selection, goes off to another page, does something else, then comes back to the original page and needs their selections loaded into it. What is the quickest and most efficient way of storing and reusing that data? In a text file named by the session id? In a temp DB table? In the session data itself (sessions are DB-backed, so size isn't a limit), as a serialised string, possibly using gzcompress or gzencode? Any advice or insight would be great. Thank you!
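    For scale, a sketch of the serialise-and-compress option the question mentions; a few thousand integer keys typically compress to a few kilobytes:

        <?php
        // Store the selection in the (DB-backed) session, compressed.
        $_SESSION['selection'] = gzcompress(serialize($selectedIds), 6);

        // Later, on returning to the page:
        $selectedIds = unserialize(gzuncompress($_SESSION['selection']));
        ?>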


  • Send data to webserver from C#, what's the most efficient way?

    - by Brian
    I am sending GPS coordinates from a Windows Mobile phone to a webserver using a basic program I wrote in C#. The problem is that the data plan on the phone only allows 4MB per month, and I was planning on updating the location every 10 seconds. Currently I create a web request every 10 seconds to a PHP page on the server, with the coordinates passed in the URL, and the PHP page saves them to the database. This generates about 1K of data per request; at this rate I will hit my data limit in less than a day. Is there a more efficient way to do this?
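    Most of that 1K is HTTP and URL overhead per fix; a sketch of packing a fix into a 12-byte binary record sent over a plain TCP stream instead (the endpoint and framing are hypothetical, and only Compact Framework-era APIs are used):

        using System;
        using System.Net.Sockets;

        public static class GpsSender
        {
            public static void SendFix(NetworkStream stream, float lat, float lon)
            {
                // 4 bytes latitude + 4 bytes longitude + 4 bytes Unix time.
                byte[] payload = new byte[12];
                Buffer.BlockCopy(BitConverter.GetBytes(lat), 0, payload, 0, 4);
                Buffer.BlockCopy(BitConverter.GetBytes(lon), 0, payload, 4, 4);
                int ts = (int)(DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalSeconds;
                Buffer.BlockCopy(BitConverter.GetBytes(ts), 0, payload, 8, 4);
                stream.Write(payload, 0, payload.Length);
            }
        }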


  • How do you automatically refresh part of a page using JavaScript or AJAX?

    - by Ryan
        $messages = $db->query("SELECT * FROM chatmessages
                                ORDER BY datetime DESC, displayorderid DESC
                                LIMIT 0,10");
        while ($message = $db->fetch_array($messages)) {
            $oldmessages[] = $message['message'];
        }
        $oldmessages = array_reverse($oldmessages);
        ?>
        <div id="chat">
        <?php
        for ($count = 0; $count < 9; $count++) {
            echo $oldmessages[$count];
        }
        ?>
        <script language="javascript" type="text/javascript">
        setInterval("document.getElementById('chat').innerHTML = '<NEW CONTENT OF #CHAT>'", 1000);
        </script>
        </div>

    I'm trying to create a PHP chatroom script, but I'm having a lot of trouble getting the chat div to refresh automatically with the new content. How do you make it do that? I've been searching for almost an hour.
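    A sketch of the usual polling approach: a small endpoint (here a hypothetical messages.php that prints the latest messages as HTML) fetched with XMLHttpRequest on a timer:

        <script type="text/javascript">
        function refreshChat() {
            var xhr = new XMLHttpRequest();
            xhr.onreadystatechange = function () {
                if (xhr.readyState === 4 && xhr.status === 200) {
                    document.getElementById('chat').innerHTML = xhr.responseText;
                }
            };
            xhr.open('GET', 'messages.php', true);
            xhr.send();
        }
        setInterval(refreshChat, 1000);
        </script>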


  • How to query range of data in DB2 with highest performance?

    - by Fuangwith S.
    Usually I need to retrieve data from a table in some range, for example a separate page for each search result. In MySQL I use the LIMIT keyword, but in DB2 I don't know the equivalent. I currently use this query to retrieve a range of data:

        SELECT *
        FROM (
            SELECT SMALLINT(RANK() OVER (ORDER BY NAME DESC)) AS RUNNING_NO,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
            ORDER BY NAME DESC
            FETCH FIRST 20 ROWS ONLY
        ) AS TMP
        ORDER BY TMP.RUNNING_NO ASC
        FETCH FIRST 10 ROWS ONLY

    but I know it's bad style. So how should I query for the best performance?
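    A sketch of the standard DB2 pagination idiom: number the rows once with ROW_NUMBER() and keep only the requested slice (rows 11-20 here, i.e. the second 10-row page):

        SELECT RN, DATA_KEY_VALUE, SHOW_PRIORITY
        FROM (
            SELECT ROW_NUMBER() OVER (ORDER BY NAME DESC) AS RN,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
        ) AS TMP
        WHERE RN BETWEEN 11 AND 20
        ORDER BY RN;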


  • Migrating from mssql to firebird: pro and cons

    - by user193655
    I am considering the migration for four reasons: (1) SQL Server installation is a nightmare, especially for single-user software: my software installs in 10 seconds, SQL Server in an hour, while Firebird installation is much easier; (2) SQL Server runs on Windows Server only; (3) my customers all have the Express edition; (4) I am not using any advanced features. I am now starting to use FILESTREAM, but the main reason for that is the Express edition's 4/10GB database size limit. So these are all pros of moving to Firebird. What are the cons? I could also plan to support both platforms, but I fear this would backfire.


  • Do bit operations cause programs to run slower?

    - by flashnik
    I'm dealing with a problem that works with a lot of data, currently represented as unsigned int values. I know that the real values never exceed a limit of 1000. Questions: (1) I can use unsigned short to store them; an upside is that it uses less storage space. Will performance suffer? (2) If I store the data as short but all the calling functions use int, I need to convert between the types when storing or extracting values. Will performance suffer, and will the loss be dramatic? (3) If I instead pack 10-bit values into an array of unsigned int, what happens in this case compared with the previous ones?
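    For the third option, a sketch of the usual shift-and-mask packing: 1000 < 1024 fits in 10 bits, so three values fit per 32-bit word (at the cost of extra ALU work on every access):

        #include <stdint.h>
        #include <stddef.h>

        #define BITS     10u
        #define PER_WORD 3u                  /* 3 * 10 = 30 of 32 bits used */
        #define MASK     ((1u << BITS) - 1u)

        static uint32_t get10(const uint32_t *a, size_t i)
        {
            /* Select the word, shift the value down, mask off neighbours. */
            return (a[i / PER_WORD] >> (BITS * (i % PER_WORD))) & MASK;
        }

        static void set10(uint32_t *a, size_t i, uint32_t v)
        {
            /* Clear the 10-bit slot, then OR in the new value. */
            uint32_t shift = BITS * (i % PER_WORD);
            a[i / PER_WORD] = (a[i / PER_WORD] & ~(MASK << shift))
                            | ((v & MASK) << shift);
        }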


  • Does Postgresql varchar count using unicode character length or ASCII character length?

    - by bennylope
    I tried importing a database dump from a SQL file, and the insert failed when inserting the string Mér into a field defined as varying(3). I didn't capture the exact error, but it pointed to that specific value with the constraint of varying(3). Given that I considered this unimportant to what I was doing at the time, I just changed the value to Mer, it worked, and I moved on. Does a varying field's limit take the byte length of the string into account? What really boggles my mind is that this was dumped from another PostgreSQL database, so it doesn't make sense how the constraint could have allowed the value to be written in the first place.
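    For reference, a sketch showing that varchar(n) counts characters, not bytes, in PostgreSQL: in a UTF-8 database 'Mér' is 3 characters but 4 bytes, so a failure on import usually points to a client_encoding mismatch during the restore rather than to the constraint itself:

        CREATE TEMP TABLE t (v character varying(3));
        INSERT INTO t VALUES ('Mér');        -- fits: 3 characters
        SELECT char_length(v)  AS chars,     -- 3
               octet_length(v) AS bytes      -- 4 in a UTF-8 database
        FROM t;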


  • If my application doesn't use a lot of memory, can I ignore viewDidUnload:?

    - by iPhoneToucher
    My iPhone app generally uses under 5MB of live memory, and even in the most extreme conditions stays under 8MB. The iPhone 2G has 128MB of RAM, and from what I've read an app should only expect to have 20-30MB to use. Given that I never expect to get anywhere near the memory limit, do I need to care about memory warnings and setting objects to nil in viewDidUnload:? The only way I see my app getting memory warnings is if something else on the phone is screwing with the memory, in which case the entire phone would be acting silly. I built my app without ever using viewDidUnload:, so there are more than a hundred classes I'd need to inspect and add code to if I did need to implement it.
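    For reference, the canonical pattern is small per class: the view itself is released for you, and viewDidUnload: only drops retained subview references so they can be rebuilt from the nib. A sketch with a hypothetical outlet:

        - (void)viewDidUnload {
            [super viewDidUnload];
            // Release retained outlets; they are recreated when the
            // view reloads from its nib.
            self.headerLabel = nil;   // hypothetical outlet
        }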

