What's the best way to manage a large number of documents (Word, PPT, PDF, ...) when Windows Explorer is not sufficient? Preferably a desktop-based solution.
Hi All,
I would be happy if someone could clear up a doubt. I can see objects in the view by
using <%= debug @object %>, and there are other methods available besides debug,
like
.to_yaml, etc.
Is there any method available for seeing the SQL generated by an
ActiveRecord call in the view? I can find it in the console, but
it gets confusing when we run multiple queries.
example:
User.find :all
it will produce
"SELECT * FROM users;"
in the console output.
But I want it in the view, or at some other specific point like the YAML output, etc.
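To sketch what I mean, something like the lines below, right in the view, is what I am after. As far as I know, to_sql is available on relations in Rails 3 and later, so it may not apply to older versions, and the :active column is just a made-up example:

<%# sketch only: to_sql exists on relations in Rails 3+, may not apply here %>
<% users = User.where(:active => true) %>   <%# :active is a made-up column %>
<p><%= users.to_sql %></p>   <%# renders something like SELECT "users".* FROM "users" WHERE ... %>
<%= debug users.to_a %>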
Thanks,
Jak
I need to set a ulimit for MySQL to use large pages; I've done this in limits.conf. However, limits.conf (pam_limits.so) doesn't get read for init, only for "real" shells. I solved this before by adding "ulimit -l" to the initscript's start function, but I need some repeatable way to do this now that the boxes are managed with Chef, and we don't want to take over a file that's actually owned by the RPM.
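What I was picturing, in case it helps, is a small drop-in file that Chef owns instead of the RPM's initscript, on the assumption that the Red Hat initscript sources /etc/sysconfig/mysqld (that assumption is exactly what I would want to verify), roughly:

# rough sketch: Chef owns a sysconfig drop-in rather than the RPM's initscript
# assumes /etc/init.d/mysqld sources /etc/sysconfig/mysqld -- please verify
file '/etc/sysconfig/mysqld' do
  content "ulimit -l unlimited\n"
  owner   'root'
  group   'root'
  mode    '0644'
  notifies :restart, 'service[mysqld]'
end

service 'mysqld' do
  supports :restart => true
  action :nothing
end

If the initscript does not source that file, I am back to patching the start function, which is what I am trying to avoid.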
I have many large Word 2007 documents containing a few dozen equations each. Is there a way to locate the equations using Word's Find command, or do I have to hunt for them old-school?
I tried searching for a graphic (^g) and a field (^d), but that didn't do the trick. Am I missing something obvious? Might there be a way to do this using VB or some other trick?
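Failing a Find trick, even a script that tells me which files contain equations (so I know which documents to open) would help. This is the sort of thing I was imagining - a sketch that assumes the documents are .docx files whose word/document.xml marks equations with <m:oMath>, and that uses the rubyzip gem rather than VBA:

# sketch: count Office Math (<m:oMath>) elements in each .docx in a folder
# assumes the rubyzip gem; only says which files have equations, not where
require 'zip'

Dir.glob('*.docx').sort.each do |path|
  xml = Zip::File.open(path) { |zip| zip.read('word/document.xml') }
  puts "#{path}: #{xml.scan(/<m:oMath[ >]/).size} equation(s)"
end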
I'm trying to figure out the optimal way to do a large copy from my hard drive using dd - specifically, what the best block size to use is, which I would assume is the hardware block size for that drive.
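If it just comes down to measuring, a throwaway loop like this one is the kind of thing I would try (the /dev/sdX device is a placeholder, it reads to /dev/null so it only measures the read side, and the OS cache will skew repeat runs):

# quick-and-dirty timing of a few block sizes; /dev/sdX is a placeholder
# note: repeat runs hit the OS cache, so the numbers are only a rough guide
require 'benchmark'

total = 256 * 1024 * 1024                          # read 256 MiB per run
[512, 4096, 65_536, 1_048_576].each do |bs|
  secs = Benchmark.realtime do
    system('dd', 'if=/dev/sdX', 'of=/dev/null', "bs=#{bs}", "count=#{total / bs}")
  end
  puts "bs=#{bs}: #{(total / (1024.0 * 1024.0) / secs).round(1)} MiB/s"
end

Is there a better way than brute-force timing like this?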
I have a dual Xeon hex core machine running an IO intensive application. (WinXP 32) I am seeing a hardware driver (1/2 user mode, 1/2 kernel, streaming data) that is using 6k delta page faults per second. When other applications load or allocate large amounts of memory the driver's hardware buffer gets an underrun (application not feeding it fast enough).
Could this be because the kernel is only using one core to service page fault interrupts?
Rationale:
I want to manage libraries of media files (music, images) using git. There is git-annex, but it requires the Haskell Platform, and they do not play together well (it's also quite a big dependency for me).
Question:
Is there any other plugin with this functionality, or would it be possible to write such a plugin (resources?)?
Similar questions:
Self-hosted, cross-platform repository for large files
Using Git to Manage An iTunes Library?
I am looking to move a file share (100 GB or so) from one domain/server to a new domain and server. I would like to do this with little to no downtime and if possible I would like to be able to map permissions from the groups/users in the current system to groups/users in the new domain.
A side question: a large number of the files in the system are Office documents with hard-coded links to the old file server. Is there any way to programmatically change all those links to the new file server?
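If these are the newer XML-based formats (.docx and friends), the kind of thing I was picturing for the link rewrite is sketched below - it assumes the old server name shows up in the hyperlink relationship targets (word/_rels/document.xml.rels), uses the rubyzip gem, and oldserver/newserver are placeholders; I would only run it against copies:

# sketch: point hyperlink targets at the new server inside .docx files
# assumes the rubyzip gem and that targets live in word/_rels/document.xml.rels
require 'zip'

Dir.glob('**/*.docx').each do |path|
  Zip::File.open(path) do |zip|
    rels = 'word/_rels/document.xml.rels'
    next unless zip.find_entry(rels)
    xml = zip.read(rels)
    next unless xml.include?('oldserver')
    zip.get_output_stream(rels) { |io| io.write(xml.gsub('oldserver', 'newserver')) }
  end
end

The older binary .doc/.xls formats would need a different approach.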
Sometimes we need to view large files - 30 MB-100 MB.
Usually we use the FAR viewer for this. Sometimes we need to copy long traces from such a file to the clipboard, but the FAR viewer can only copy one screen at a time.
What can be used for this purpose?
It should be GUI-based and freeware.
UPDATE:
We need the ability to navigate through the file and to see updates to the file in the meantime (e.g. like tail -f).
I keep getting this error lately when I try to copy large (200+ MB) files over to my external drive. After this, the disk becomes unresponsive and I have to unplug it and plug it in again to get it working.
The copy process is also unreasonably slow.
It is worth noting that this happens on Windows too, so it's not the notorious "Error -36" bug OS X had prior to 10.6.3.
The disk is a Western Digital 3200BMV.
Any ideas?
What effect, if any, does the number of icons on a user's Windows XP desktop have on system performance? Can a large number of desktop icons slow down a system?
I have a zip of a pretty large website. I FTP the zip over to the server, then unzip it and extract it to the website folder, but this is very slow.
Is there any way to extract and copy only the files that are newer (rather than all of the files)?
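If it comes down to scripting it myself, this is the sort of thing I had in mind - a sketch using the rubyzip gem, with the zip name and target folder as placeholders:

# sketch: extract only entries that are missing on disk or newer than the
# on-disk copy; assumes the rubyzip gem
require 'zip'
require 'fileutils'

Zip::File.open('site.zip') do |zip|
  zip.each do |entry|
    dest = File.join('/var/www/site', entry.name)
    next if File.exist?(dest) && File.mtime(dest) >= entry.time
    FileUtils.mkdir_p(File.dirname(dest))
    entry.extract(dest) { true }    # returning true from the block overwrites the old file
  end
end

The idea is simply to skip anything whose copy on disk is already at least as new as the archived one.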
I recall once stumbling on a program that could take multiple application windows and wrap them inside a large window with a tabbed interface. One use of this, for example, would be to wrap multiple instances of Excel into one window, and thus one icon on the taskbar.
I couldn't find mention of this program via Google, because of the multiple meanings of the word "window". Does anyone remember, or know of, such a program?
A number of texts state that the most important aspects offered by a DBMS are availability, integrity and secrecy. As part of a homework assignment I have been tasked with mentioning attacks that would affect each aspect. This is what I have come up with - are they any good?
Availability - DDoS attack
Secrecy - SQL injection attack
Integrity - Use of trojans to gain access to objects with higher security roles
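For the secrecy one, I was going to illustrate it with the classic unescaped-input case, roughly like this (illustration only - the table and column names are made up):

# illustration: untrusted input concatenated straight into a query
name  = "' OR '1'='1"                                    # attacker-supplied value
query = "SELECT * FROM accounts WHERE name = '#{name}'"
puts query
# => SELECT * FROM accounts WHERE name = '' OR '1'='1'
#    which returns every row, i.e. data the attacker should not be able to read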
My Outlook fails when trying to download any message/attachment larger than 3 MB, and it will not download any more emails until I go into my internet email account and delete that message. It is then fine until another large email tries to download into Outlook.
This is happening on an XP machine with Outlook 2000 and a Windows 7 machine with Outlook 2003. I am running McAfee on both machines.
I am running ubuntu on various computers on a home wireless network. Some are on 9.04x64, some 10.04x64 and one 9.04x32.
Running scp with a large file starts out at 2.1 mbps and drops down to about 200k, stalling and dropping until the transfer is complete. I've noticed this when I have a secure shell open on any of these servers as well.
I have tried this with 2 different routers, both brand new, different brands.
I'm looking for a way to set a group of File Types to "Index Properties and File Contents" (Control Panel, Indexing Options, Advanced Options, File Types).
Basically I'd like to write a batch file that switches that setting for a large group of file types, and be able to share it with my entire team. Clicking through the UI is time-consuming for everyone.
This is a great solution for bringing up the GUI, but I'd like to create a batch file
What is the command line for Indexing Options?
I am adding a field to a SQL Server 2008 database. The field needs to accommodate up to 8000 characters (occasionally Unicode characters as well). It seems from my research that the text data type is being deprecated. Should I go with nvarchar, or stick with text anyway for such a large field?
I love using Dropbox to sync files between all my machines, and I've heard it uses rsync internally to keep files synced.
Sometimes I need to sync very large things, and I don't necessarily want to pay for storage space on someone else's server when I have my own. So does anyone know of any nice cross-platform (pref. open source) automatic file-sync applications out there for this?
sidenote: Here is a Dropbox referral link, if you're feeling generous.
I have a large Exchange server with many hundreds of thousands of emails in thousands of folders.
I would like to generate a list of how many emails have been sent, by user, for a subset of the public folders.
If I could run SQL against the server (can I?), I would like to run a query along the lines of:
SELECT from, count(*)
FROM emails
WHERE email_is_in_folder_or_descendents('Public Folders/Customers/XYZ')
GROUP BY from
Is this possible?
I have full administrator access to the server.
I have a program that automatically populates a Word document. Let's say the Word document has three sections: A, B, and C. My program populates section B. Section B can be 1 line or 100 lines, so what I want to be able to do is make section C automatically start on a fresh page.
section B
<newPageHack> <-- Can I add anything here to force section C onto a new page no matter how large section B is?
section C (on a new page)
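One idea I had, assuming I can get at the generated file as .docx XML (which may not be the case), is to have the program drop an explicit page break where <newPageHack> sits. The markup below is from memory, so please correct it:

# sketch: replace the placeholder with an explicit OOXML page break, assuming
# the generated document is a .docx whose word/document.xml I can edit
# (<w:br w:type="page"/> is quoted from memory -- verify against the spec)
PAGE_BREAK = '<w:p><w:r><w:br w:type="page"/></w:r></w:p>'

xml = File.read('word/document.xml')
File.write('word/document.xml', xml.sub('<newPageHack>', PAGE_BREAK))

Is there a cleaner way, such as a setting on section C itself, that works no matter how long section B gets?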
I have about 50 small videos (and a few large ones). I want to convert them all with the SAME settings. It's basically: change the audio to X with Y bitrate, change the video to XviD, and do full processing on the video and audio. Then force the FPS to 15, since every program I tried (including VirtualDub) thinks it's 0.3 FPS.
How do I apply all of these settings to all of my files?
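If the answer turns out to be "script it", a loop that shells out to ffmpeg with the same flags every time is roughly what I was picturing - the flags below are from memory, and the codec/bitrate values and the *.avi glob just stand in for the X and Y above:

# sketch: apply one set of ffmpeg settings to every video in the folder
# -r 15 forces 15 fps; codec/bitrate values are placeholders for X and Y
Dir.glob('*.avi').each do |input|
  output = File.basename(input, '.*') + '_converted.avi'
  system('ffmpeg', '-i', input,
         '-c:v', 'libxvid',                 # XviD video
         '-c:a', 'libmp3lame', '-b:a', '128k',
         '-r', '15',
         output)
end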
I store my music on an external hard drive and play it with foobar2k. However, the drive letter changes, which usually means I need to rebuild a fairly large playlist every so often.
I'm wondering if there's a way to reserve a drive letter for a specific external device (or type of device) by device ID or volume name, or if I'm better off using a NTFS mount point, and re-mounting the drive to a folder each time.
I'm using either a Windows XP or 7 system, and the external drive is NTFS.