Search Results

Search found 10622 results on 425 pages for 'shared hosting'.

Page 400 of 425

  • Delaying emails in PHP to avoid exceeding server limit

    - by Andrew P.
    Okay, so here's my problem: I have a list of members on a website, and periodically one of the admins of my site (who are not very web or tech savvy) will send a newsletter to the member list. My current member list is well over 800 individuals long. So, I wrote an email script that sends the email to the full member list, with the members listed in the Bcc header. However, I've discovered that my host server has a limit of 300 emails per hour, which I apparently exceed even though the members are listed in the Bcc field. (I wasn't previously aware that the behaviour of Bcc was to send a separate email for each name on the list...)

    After some thought, I've come to the conclusion that my only solution is to have my script send the email to only the first 300 addresses, wait an hour, send it to the next 300, wait another hour, and so on until I've sent the email to the whole member list.

    Looking around on the internet, I've seen some other solutions people have come up with for delaying emails in PHP. Sleep() is obviously not an option, because I can't just leave the script open and running for three or four hours. I've seen some people suggest cron jobs, but I'm not sure how feasible it would be to create three new cron jobs every time I send an email, use them once, and then delete them afterward. The final (and what I think is the smartest) solution I've seen is to have a table in my database to temporarily store the emails to be delayed and sent later, and then create a cron job that checks this SQL table every hour or so, compares the timestamp of each row to the current timestamp, and sends the email if an hour has passed.

    So I'm asking which method you would all recommend. Is there an easier solution that I've completely overlooked (aside from getting a different hosting plan, ha!), or is there a cleaner way to do it than the database / cron job approach?

    tl;dr: I have 800 emails to send in an hour on a server that limits me to 300/hr. Using PHP, find a way to get around this problem such that the person sending the email only needs to click "send."
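    A minimal sketch of the queue-table approach described above, assuming a MySQL table drained by a cron job that runs the script hourly; the table name, column names, and connection details are illustrative, not taken from the question:

        <?php
        // send_queued.php - run hourly from cron, e.g.
        //   0 * * * * /usr/bin/php /path/to/send_queued.php
        // Assumed (hypothetical) queue table:
        //   CREATE TABLE email_queue (
        //       id INT AUTO_INCREMENT PRIMARY KEY,
        //       recipient VARCHAR(255), subject VARCHAR(255), body TEXT,
        //       sent TINYINT DEFAULT 0
        //   );
        $pdo = new PDO('mysql:host=localhost;dbname=site', 'dbuser', 'dbpass');

        // Grab at most 300 unsent messages - the host's hourly limit.
        $rows = $pdo->query(
            'SELECT id, recipient, subject, body FROM email_queue WHERE sent = 0 LIMIT 300'
        )->fetchAll(PDO::FETCH_ASSOC);

        $done = $pdo->prepare('UPDATE email_queue SET sent = 1 WHERE id = ?');

        foreach ($rows as $row) {
            if (mail($row['recipient'], $row['subject'], $row['body'])) {
                $done->execute(array($row['id']));
            }
        }

    The "send" button then only has to insert the 800 rows into the queue; the cron job drains them at 300 per hour with no long-running script.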

    Read the article

  • MVC Rendered Partial, how to get partial/view model in main model post to controller

    - by user1475788
    I have a text file, and when users upload the file, the controller action method parses it using a state machine and stores some values in a generic list. I pass this back to the view in the form of an IEnumerable. Within my main view, I render a partial view that iterates over this IEnumerable and displays labels and a textarea for each item, where users can add their input. When the user hits the save button, the IEnumerable list from the rendered partial view is null, so please advise on any solutions.

    Here is my main view:

        @model RunLog.Domain.Entities.RunLogEntry
        @{
            ViewBag.Title = "Create";
            Layout = "~/Views/Shared/_Layout.cshtml";
        }
        @using (Html.BeginForm("Create", "RunLogEntry", FormMethod.Post, new { enctype = "multipart/form-data" }))
        {
            <div id="inputTestExceptions" style="display: none;">
                <table class="grid" style="width: 450px; margin: 3px 3px 3px 3px;">
                    <thead>
                        <tr>
                            <th>Exception String</th>
                            <th>Comment</th>
                        </tr>
                    </thead>
                    <tbody>
                        @if (Model.TestExceptions != null)
                        {
                            foreach (var p in Model.TestExceptions)
                            {
                                Html.RenderPartial("RunLogTestExceptionSummary", p);
                            }
                        }
                    </tbody>
                </table>
            </div>
        }

    The partial view is as follows:

        @model RunLog.Domain.Entities.RunLogEntryTestExceptionDisplay
        <tr>
            <td>
                @Model.TestException
            </td>
            <td>
                @Html.TextAreaFor(Model.Comment, new { style = "width: 200px; height: 80px;" })
            </td>
        </tr>

    Controller action:

        [HttpPost]
        public ActionResult Create(RunLogEntry runLogEntry, String ServiceRequest, string Hour,
            string Minute, string AMPM, string submit, IEnumerable<HttpPostedFileBase> file,
            String AssayPerformanceIssues1, IEnumerable<RunLogEntryTestExceptionDisplay> models)
        {
        }

    The problem is that the test exceptions collection, which contains the exception string and comment, is coming back null.
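    For reference, the default MVC model binder only binds a posted collection when the input names carry an index (for example TestExceptions[0].Comment). Rendering each item through RenderPartial produces un-prefixed names like Comment, which cannot be mapped back to a collection on post. A hedged sketch of the usual fix, assuming TestExceptions is an IList on RunLogEntry, is to render the rows from the parent model with an indexed loop:

        @for (int i = 0; i < Model.TestExceptions.Count; i++)
        {
            <tr>
                <td>
                    @Model.TestExceptions[i].TestException
                    @Html.HiddenFor(m => m.TestExceptions[i].TestException)
                </td>
                <td>
                    @Html.TextAreaFor(m => m.TestExceptions[i].Comment,
                        new { style = "width: 200px; height: 80px;" })
                </td>
            </tr>
        }

    With names generated this way, the collection comes back populated as a property of the posted RunLogEntry rather than arriving as null.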

    Read the article

  • GCC: Simple inheritance test fails

    - by knight666
    I'm building an open source 2D game engine called YoghurtGum. Right now I'm working on the Android port, using the NDK provided by Google. I was going mad because of the errors I was getting in my application, so I made a simple test program: class Base { public: Base() { } virtual ~Base() { } }; // class Base class Vehicle : virtual public Base { public: Vehicle() : Base() { } ~Vehicle() { } }; // class Vehicle class Car : public Vehicle { public: Car() : Base(), Vehicle() { } ~Car() { } }; // class Car int main(int a_Data, char** argv) { Car* stupid = new Car(); return 0; } Seems easy enough, right? Here's how I compile it, which is the same way I compile the rest of my code: /home/oem/android-ndk-r3/build/prebuilt/linux-x86/arm-eabi-4.4.0/bin/arm-eabi-g++ -g -std=c99 -Wall -Werror -O2 -w -shared -fshort-enums -I ../../YoghurtGum/src/GLES -I ../../YoghurtGum/src -I /home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/include -c src/Inheritance.cpp -o intermediate/Inheritance.o (Line breaks are added for clarity). This compiles fine. But then we get to the linker: /home/oem/android-ndk-r3/build/prebuilt/linux-x86/arm-eabi-4.4.0/bin/arm-eabi-gcc -lstdc++ -Wl, --entry=main, -rpath-link=/system/lib, -rpath-link=/home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib, -dynamic-linker=/system/bin/linker, -L/home/oem/android-ndk-r3/build/prebuilt/linux-x86/arm-eabi-4.4.0/lib/gcc/arm-eabi/4.4.0, -L/home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib, -rpath=../../YoghurtGum/lib/GLES -nostdlib -lm -lc -lGLESv1_CM -z /home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib/crtbegin_dynamic.o /home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib/crtend_android.o intermediate/Inheritance.o ../../YoghurtGum/bin/YoghurtGum.a -o bin/Galaxians.android As you can probably tell, there's a lot of cruft in there that isn't really needed. That's because it doesn't work. It fails with the following errors: intermediate/Inheritance.o:(.rodata._ZTI3Car[typeinfo for Car]+0x0): undefined reference to `vtable for __cxxabiv1::__si_class_type_info' intermediate/Inheritance.o:(.rodata._ZTI7Vehicle[typeinfo for Vehicle]+0x0): undefined reference to `vtable for __cxxabiv1::__vmi_class_type_info' intermediate/Inheritance.o:(.rodata._ZTI4Base[typeinfo for Base]+0x0): undefined reference to `vtable for __cxxabiv1::__class_type_info' collect2: ld returned 1 exit status make: *** [bin/Galaxians.android] Fout 1 These are the same errors I get from my actual application. If someone could explain to me where I went wrong in my test or what option or I forgot in my linker, I would be very, extremely grateful. Thanks in advance. UPDATE: When I make my destructors non-inlined, I get new and more exciting link errors: intermediate/Inheritance.o:(.rodata+0x78): undefined reference to `vtable for __cxxabiv1::__si_class_type_info' intermediate/Inheritance.o:(.rodata+0x90): undefined reference to `vtable for __cxxabiv1::__vmi_class_type_info' intermediate/Inheritance.o:(.rodata+0xb0): undefined reference to `vtable for __cxxabiv1::__class_type_info' collect2: ld returned 1 exit status make: *** [bin/Galaxians.android] Fout 1
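    The undefined __cxxabiv1 type_info vtables mean the C++ runtime support library never made it onto the link line: the link step above drives arm-eabi-gcc with -nostdlib, lists -lstdc++ before the object files (a static archive listed before the objects that reference it gets skipped), and splits the -Wl options with spaces so most of them never reach the linker at all. A guess at a corrected link step, assuming the NDK r3 paths from the question and omitting the GLES library the bare test program does not need:

        /home/oem/android-ndk-r3/build/prebuilt/linux-x86/arm-eabi-4.4.0/bin/arm-eabi-g++ \
            -nostdlib \
            -Wl,--entry=main \
            -Wl,-rpath-link=/home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib \
            -Wl,-dynamic-linker=/system/bin/linker \
            -L/home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib \
            -L/home/oem/android-ndk-r3/build/prebuilt/linux-x86/arm-eabi-4.4.0/lib/gcc/arm-eabi/4.4.0 \
            /home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib/crtbegin_dynamic.o \
            intermediate/Inheritance.o \
            ../../YoghurtGum/bin/YoghurtGum.a \
            /home/oem/android-ndk-r3/build/platforms/android-5/arch-arm/usr/lib/crtend_android.o \
            -lstdc++ -lm -lc \
            -o bin/Galaxians.android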

    Read the article

  • The remote server returned an error: (400) Bad Request

    - by pravakar
    Hi, I am getting the following errors: "The remote server returned an error: (400) Bad Request" "Requested time out" sometimes when connecting to a host using a web service. If the XML returned is 5 kb then it is working fine, but if the size is 450kb or above it is displaying the error. Below is my code as well as the config file that resides at the client system. We don't have access to the settings of the server. Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Dim fileName As String = Server.MapPath("capitaljobs2.xml") Dim client = New CapitalJobsService.DataServiceClient("WSHttpBinding_IDataService", "http://xyz/webservice.svc") Dim userAccount = New UserAccount() 'replace here Dim jobAdList = client.GetProviderJobs(userAccount) '## Needed only to create XML files - do not ucomment - will overwrite files 'if (jobAdList != null) ' SerialiseJobAds(fileName, jobAdList); '## Read new ads from Xml file Dim capitalJobsList = DeserialiseJobdAds(fileName) UpdateProviderJobsFromXml(client, userAccount, capitalJobsList) client.Close() End Sub Private Shared Function DeserialiseJobdAds(ByVal fileName As String) As CapitalJobsService.CapitalJobsList Dim capitalJobsList As CapitalJobsService.CapitalJobsList ' Deserialize the data and read it from the instance If File.Exists(fileName) Then Dim fs = New FileStream(fileName, FileMode.Open) Dim reader = XmlDictionaryReader.CreateTextReader(fs, New XmlDictionaryReaderQuotas()) Dim ser2 = New DataContractSerializer(GetType(CapitalJobsList)) capitalJobsList = DirectCast(ser2.ReadObject(reader, True), CapitalJobsList) reader.Close() fs.Close() Return capitalJobsList End If Return Nothing End Function And the config file <system.web> <httpRuntime maxRequestLength="524288" /> </system.web> <system.serviceModel> <bindings> <wsHttpBinding> <binding name="WSHttpBinding_IDataService" closeTimeout="00:10:00" openTimeout="00:10:00" receiveTimeout="00:10:00" sendTimeout="00:10:00" bypassProxyOnLocal="false" transactionFlow="false" hostNameComparisonMode="StrongWildcard" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647" messageEncoding="Text" textEncoding="utf-8" useDefaultWebProxy="true" allowCookies="false"> <readerQuotas maxDepth="2000000" maxStringContentLength="2000000" maxArrayLength="2000000" maxBytesPerRead="2000000" maxNameTableCharCount="2000000" /> <reliableSession ordered="true" inactivityTimeout="00:10:00" enabled="false" /> <security mode="None"> <transport clientCredentialType="Windows" proxyCredentialType="None" realm=""/> <message clientCredentialType="Windows" negotiateServiceCredential="true" establishSecurityContext="true"/> </security> </binding> </wsHttpBinding> </bindings> <client> <endpoint address="http://xyz/DataService.svc" binding="wsHttpBinding" bindingConfiguration="WSHttpBinding_IDataService" contract="CapitalJobsService.IDataService" name="WSHttpBinding_IDataService"> <identity> <dns value="localhost"/> </identity> </endpoint> </client> </system.serviceModel> I am using "Fiddler" to track the activities it is reading and terminating file like * FIDDLER: RawDisplay truncated at 16384 characters. Right-click to disable truncation. * But in config the number 16348 is not mentioned anywhere. Can you figure out if the error is on client or server side? The settings above are on the client side. Thanks in advance.
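    The "works at 5 KB, fails around 450 KB" symptom is the classic sign of a message-size quota being hit, and the config shown only raises the quotas on the client. The service has its own binding, whose maxReceivedMessageSize defaults to 64 KB, so large messages sent to it are rejected with 400 Bad Request regardless of what the client allows; if the people who run the service can change its config, a matching binding on their side is usually what is missing. The Fiddler line is unrelated: it only says Fiddler truncates its own display at 16,384 characters. A sketch of the kind of server-side binding that would need to exist (names here are illustrative):

        <system.serviceModel>
          <bindings>
            <wsHttpBinding>
              <binding name="LargeMessages"
                       maxReceivedMessageSize="2147483647"
                       maxBufferPoolSize="2147483647">
                <readerQuotas maxDepth="2000000" maxStringContentLength="2000000"
                              maxArrayLength="2000000" maxBytesPerRead="2000000"
                              maxNameTableCharCount="2000000" />
              </binding>
            </wsHttpBinding>
          </bindings>
          <services>
            <service name="CapitalJobsService.DataService">
              <endpoint address="" binding="wsHttpBinding"
                        bindingConfiguration="LargeMessages"
                        contract="CapitalJobsService.IDataService" />
            </service>
          </services>
        </system.serviceModel>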

    Read the article

  • jqModal and jQuery widget long shot

    - by rod
    Hi All, I just started playing around with jquery widgets within my jqmodals in my mvc app. I know this may be a long shot but I'll take it. Initially, I can click the Add link, get the alert ("which is the prize", watching too much tv), next click cancel to close modal and get the desired results. I can, then, click the Edit link and get the same desired results. However, if I click Edit link first then I try to click the Add link, "forget about it" I don't get the alert (which means my widget did not init). But I can still go back and click Edit and get the prize (the alert message). ajax: "/Home/EditPrintAdLine" and ajax: "/Home/AddPrintAdLine" render the same web user control Any ideas? <%@ Page Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage" %> <asp:Content ID="indexTitle" ContentPlaceHolderID="TitleContent" runat="server"> Home Page </asp:Content> <asp:Content ID="indexContent" ContentPlaceHolderID="MainContent" runat="server"> <h2><%= Html.Encode(ViewData["Message"]) %></h2> <p> To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website">http://asp.net/mvc</a>. </p> <div id="printAdLineEditDialog" class="jqmWindow"></div> <div id="printAdDialog" class="jqmWindow"></div> <table> <tr><td><a id="printAdLineItem" href="#">Add a Line Item</a></td></tr> <tr><td><a id="editPrintAdLine" href="#">Edit</a></td></tr> </table> <script type="text/javascript"> $(document).ready(function() { $.widget("ui.my_widget", { _init: function() { alert("My widget was instantiated"); } }); // Add line $('#printAdLineItem').click(function(e) { $('#printAdDialog').jqmShow(this); e.preventDefault(); }); $('#printAdDialog').jqm({ ajax: "/Home/AddPrintAdLine", onLoad: function(hash) { $('#PrintAdLine_RunDate').my_widget(); } }); // Edit line $('#editPrintAdLine').click(function(e) { $('#printAdLineEditDialog').jqmShow(this); e.preventDefault(); }); $('#printAdLineEditDialog').jqm({ ajax: "/Home/EditPrintAdLine", onLoad: function(hash) { $('#PrintAdLine_RunDate').my_widget(); } }); }); </script> </asp:Content>
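    One thing worth checking, offered as a guess rather than a confirmed fix: both modals load markup containing an element with the same id (PrintAdLine_RunDate), and once the first dialog has been loaded, $('#PrintAdLine_RunDate') keeps matching that first copy, so the widget is never initialised on the element inside the other dialog. Scoping the lookup to the dialog that was just loaded (jqModal passes its window element as hash.w) sidesteps the duplicate id:

        $('#printAdDialog').jqm({
            ajax: "/Home/AddPrintAdLine",
            onLoad: function(hash) {
                // look for the field inside this dialog only
                $(hash.w).find('#PrintAdLine_RunDate').my_widget();
            }
        });

        $('#printAdLineEditDialog').jqm({
            ajax: "/Home/EditPrintAdLine",
            onLoad: function(hash) {
                $(hash.w).find('#PrintAdLine_RunDate').my_widget();
            }
        });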

    Read the article

  • Moving MVC2 helpers to the MVC3 Razor view engine

    - by Dai Bok
    Hi, In my MVC 2 site, I have an html helper, that I use to add javascripts for my pages. In my master page I have the main javascripts I want to include, and then in the aspx pages, I include page specific javascripts. So for example, my Site.Master has something like this: .... <head> <%=html.renderScripts() %> </head> ... //core scripts for main page <%html.AddScript("/scripts/jquery.js") %> <%html.AddScript("/scripts/myLib.js") %> .... Then in the child aspx page, I may also want to include other scripts. ... //the page specific script I want to use <% html.AddScript("/scripts/register.aspx.js") %> ... So when the full page gets rendered the javascript files are all collected and rendered in the head by sitemaster placeholder function RenderScripts. This works fine. Now with MVC 3 and razor view engine, they layout pages behave differently, because now my page level javascripts are not rendered/included. Now all I see the LayoutMaster contents. How do I get the solution wo workwith MVC 3 and the razor view engine. (The helper has already been re-written to return a HTMLString ;-)) For reference: my MasterLayout looks like this: ... ... <head> @{ Html.AddJavaScript("/Scripts/jQuery.js"); Html.AddJavaScript("/Scripts/myLib.js"); } //Render scripts @html.RenderScripts() </head> .... and the child page looks like this: @{ Layout = "~/Views/Shared/MasterLayout.cshtml"; ViewBag.Title = "Child Page"; Html.AddJavaScript("/Scripts/register.aspx.js"); } .... <div>some html </div> Thanks for your help. Edit = Just to explain, if this question is not clear enough. When producing a "page" I collect all the javascript files the designers want to use, by using the html.addJavascript("filename.js") and store these in a dictionary - (1) stops people adding duplicate js files - then finally when the page is ready to render, I write out all the javascript files neatly in the header. (2) - this helper helps keep JS in one place, and prevents designers from adding javascript files all over the place. This used to work fine with Master/SiteMaster Pages in mvc 2. but how can I achieve this with razor?
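    For what it is worth, Razor renders the body view before the layout, so a helper that collects script paths while the view runs and writes them out in the layout can still work in MVC 3; the thing to verify is that the collection is stored somewhere shared across the request (for example HttpContext.Items) rather than in the view's own state. The built-in alternative, sketched here with an invented section name, is Razor sections:

        @* MasterLayout.cshtml *@
        <head>
            <script src="/Scripts/jQuery.js" type="text/javascript"></script>
            <script src="/Scripts/myLib.js" type="text/javascript"></script>
            @RenderSection("PageScripts", required: false)
        </head>

        @* child page *@
        @{
            Layout = "~/Views/Shared/MasterLayout.cshtml";
            ViewBag.Title = "Child Page";
        }
        @section PageScripts {
            <script src="/Scripts/register.aspx.js" type="text/javascript"></script>
        }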

    Read the article

  • Web frameworks for education management systems?

    - by Indebi
    So, I'm working on an idea and I'll go into a brief overview of that but my question is, What are some good web frameworks for this situation? I have some experience in the following languages: C# Python I have considerably more experience in C# than Python, however I am expecting to learn new things. My idea is this, a completely web-based community-oriented Education Management System that focuses on making students and teachers day-to-day lives easier. For students it will provide a centralized place for them to do homework, study for tests, and reinforce concepts learned previously in class. For teachers it will give them a centralized place to handle assignments, attendance, homework, tests, and all other major parts of classroom management. All of that, but in a community-oriented fashion. Everything a teacher does is shared and open to constructive criticism, allowing other teachers to use their assignments/tests and for students or other teachers to comment, rate and criticize their assignments. This encourages an environment of openness that will allow teacher's to focus on teaching and student's to focus on learning. And that community wouldn't be limited to one school or school-district, this system would be completely school-independent. Please note that I have no problem with hearing constructive criticism on this idea, however I would prefer if this post was more focused on my question. I have somewhat explored about the following options: Django ASP.NET Ruby on Rails Silverlight (1) I have Django installed and I played with it for a little bit, I really like how easy setting up databases are and how it handles the database completely for you. I don't really know how to use it very well and I don't quite understand the Model-View-Controller paradigm(?) for it yet but I haven't thought about it much. I also like the fact that it uses Python. (2) I don't really like Visual Studio for developing in ASP.NET, I hate the way the web-designer works and it just feels clunky and old. I like the server-side development part though. I don't like how expensive ASP.NET and overall Visual Studio is, even if I do get it for free for now using DreamSpark (3) I haven't been able to explore much with this, I could not get Rails (or maybe Ruby) properly installed. I first installed it within RadRails and that didn't work so I uninstalled RadRails and then installed the latest version of Ruby off the official Windows Installer and then installed Ruby on Rails through gem and even after all that it still didn't work, so I installed Netbeans and attempted to use it there but it still did not work (4) I like Silverlight in some extents, I've played with this one the most, it's very similar to WPF (which I've used the most) in a lot of ways but I don't like how database connectivity works, at least in comparison to Django. I also dislike how expensive everything with Microsoft is, even if I get it for free for now with DreamSpark. I would like to hear some suggestions from experienced web-developers as to what I should use and why, or at least what some good options are for my scenario Your help would be very appreciated

    Read the article

  • ASP.NET MVC2 model binding problem

    - by Pino
    Why is my controller recieving an empty model in this case? Using the following, <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<X.Models.ProductModel>" %> <asp:Content ID="Content1" ContentPlaceHolderID="MainContent" runat="server"> <h2>Product</h2> <% using (Html.BeginForm() {%> <%: Html.ValidationSummary(true) %> <div class="editor-label"> Product Name </div> <div class="editor-field"> <%: Html.TextBoxFor(model => model.Name) %> <%: Html.ValidationMessageFor(model => model.Name) %> </div> <br /> <div class="editor-label"> Short Description </div> <div class="editor-field"> <%: Html.TextAreaFor(model => model.ShortDesc) %> <%: Html.ValidationMessageFor(model => model.ShortDesc) %> </div> <br /> <div class="editor-label"> Long Description </div> <div class="editor-field"> <%: Html.TextAreaFor(model => model.LongDesc) %> <%: Html.ValidationMessageFor(model => model.LongDesc) %> </div> <p> <input type="submit" value="Create" /> </p> <% } %> </asp:Content> and the following controller using System.Web.Mvc; using X.Lib.Services; using X.Models; namespace X.Admin.Controllers { public class ProductController : Controller { [HttpGet] public ActionResult ProductData() { return View(); } [HttpPost] public ActionResult ProductData(ProductModel NewProduct) { //Validate and save if(ModelState.IsValid) { //Save And do stuff. var ProductServ = new ProductService(); ProductServ.AddProduct(NewProduct); } return View(); } } }

    Read the article

  • Impersonation on Windows 2000 to Windows XP Leaves Connections Open

    - by Tallek
    I'm running on a Windows 2000 Pro SP4 box (off domain) and trying to impersonate a local user on a Windows XP box (on domain). I'm using code very similar to the WindowsImpersonationContextFacade in the question posted here: http://stackoverflow.com/questions/879704/how-can-i-temporarily-impersonate-a-user-to-open-a-file. I am using impersonation to remotely start and stop windows services as well as access network shares (for some automated integration tests). To get this working, i had to use LOGON32_PROVIDER_DEFAULT and LOGON32_LOGON_NEW_CREDENTIALS when calling LogonUser. Everything worked beautifully ( Windows XP on domain to Windows XP on domain, Windows XP on domain to Windows Server 2003 off domain, and even Windows XP on domain to Windows 2000 off domain). The one issue was running on Windows 2000 Pro SP4 off the domain and trying to impersonate a local user on a Windows XP box running on the domain. To get the Windows 2000 piece working, i had to use LOGON32_PROVIDER_WINNT50 and LOGON32_LOGON_NEW_CREDENTIALS when calling LogonUser. This seemed to get me 95% of the way there, i could now impersonate the local user on the XP box and start/stop services as well as access a network share using the impersonated credentials. I'm running in to one problem though, calling Undo impersonation and closing the token handle seems to leave the connection to the remote box open. After about 10 or so impersonation calls, further impersonation attempts will fail with an error saying something about too many connections are currently open. If i look at the Computer Management - System Tools - Shared Folders - Sessions on my remote Windows XP box, i can see about 10 sessions open to the Windows 2000 box. I can manually close these (i think they may eventually close themselves, but not very quickly) and then impersonation begins working again few more times. This open session issue doesn't seem to be a problem in any of my other test scenarios, just when running locally on a Windows 2000 box. Any ideas? Edit 1: After some more testing and trying out many different things, this seems to be an issue with open sessions not being reused. On Windows 2000 only, every call to LogonUser to get a token and then using that token to impersonate seems to result in a new session being created. I'm guessing Windows XP & Windows Server 2003 are reusing open sessions since i don't seem to be having any issues with them. If I call LogonUser once, then cache the token, I seem to be able to make as many calls to impersonate as I need using the cached token without running in to the "too many connections" issue. This seems like an ugly work around though since i can't call CloseHandle() on my token every time i perform impersonation. Anybody have any thoughts or ideas, or am i stuck with this ugly hack? Thanks

    Read the article

  • Problem with custom NSProtocol and caching on iPhone

    - by TomSwift
    My iPhone app embeds a UIWebView which loads html via a custom NSProtocol handler I have registered. My problem is that resources referenced in the returned html, which are also loaded via my custom protocol handler, are cached and never reloaded. In particular, my stylesheet is cached: <link rel="stylesheet" type="text/css" href="./styles.css" /> The initial request to load the html in the UIWebView looks like this: NSString* strUrl = [NSMutableString stringWithFormat: @"myprotocol:///entry?id=%d", entryID ]; NSURL* url = [NSURL URLWithString: strUrl]; [_pCurrentWebView loadRequest: [NSURLRequest requestWithURL: url cachePolicy: NSURLRequestReloadIgnoringLocalCacheData timeoutInterval: 60 ]]; (note the cache policy is set to ignore, and I've verified this cache policy carries through to subsequent requests for page resources on the initial load) The protocol handler loads the html from a database and returns it to the client using code like this: // create the response record NSURLResponse *response = [[NSURLResponse alloc] initWithURL: [request URL] MIMEType: mimeType expectedContentLength: -1 textEncodingName: textEncodingName]; // get a reference to the client so we can hand off the data id client = [self client]; // turn off caching for this response data [client URLProtocol: self didReceiveResponse:response cacheStoragePolicy: NSURLCacheStorageNotAllowed]; // set the data in the response to our jfif data [client URLProtocol: self didLoadData:data]; [data release]; (Note the response cache policy is "not allowed"). Any ideas how I can make it NOT cache my styles.css resource? I need to be able to dynamically alter the content of this resource on subsequent loads of html that references this file. I thought clearing the shared url cache would work, but it doesnt: [[NSURLCache sharedURLCache] removeAllCachedResponses]; One thing that does work, but it's terribly inefficient, is to dynamically cache-bust the url for the stylesheet by adding a timestamp parameter: <link rel="stylesheet" type="text/css" href="./styles.css?ts=1234567890" /> To make this work I have to load my html from the db, search and replace the url for the stylesheet with a cache-busting parameter that changes on each request. I'd rather not do this. My presumption is that there is no problem if I were to load my content via the built-in HTTP protocol. In that case, I'm guessing that the UIWebView looks at any Cache-Control flags in the NSURLHTTPResponse object's http headers and abides by them. Since my NSURLResponseObject has no http headers (it's not http...) then perhaps UIWebView just decides to cached the resource (ignoring the NSURLRequest caching directive?). Ideas???

    Read the article

  • How can I limit the cache used by copying so there is still memory available for other cache?

    - by Peter
    Basic situation: I am copying some NTFS disks in openSuSE. Each one is 2TB. When I do this, the system runs slow. My guesses: I believe it is likely due to caching. Linux decides to discard useful cache (eg. kde4 bloat, virtual machine disks, LibreOffice binaries, Thunderbird binaries, etc.) and instead fill all available memory (24 GB total) with stuff from the copying disks, which will be read only once, then written and never used again. So then any time I use these apps (or kde4), the disk needs to be read again, and reading the bloat off the disk again makes things freeze/hiccup. Due to the cache being gone and the fact that these bloated applications need lots of cache, this makes the system horribly slow. Since it is USB,the disk and disk controller are not the bottleneck, so using ionice does not make it faster. I believe it is the cache rather than just the motherboard going too slow, because if I stop everything copying, it still runs choppy for a while until it recaches everything. And if I restart the copying, it takes a minute before it is choppy again. But also, I can limit it to around 40 MB/s, and it runs faster again (not because it has the right things cached, but because the motherboard busses have lots of extra bandwidth for the system disks). I can fully accept a performance loss from my motherboard's IO capability being completely consumed (which is 100% used, meaning 0% wasted power which makes me happy), but I can't accept that this caching mechanism performs so terribly in this specific use case. # free total used free shared buffers cached Mem: 24731556 24531876 199680 0 8834056 12998916 -/+ buffers/cache: 2698904 22032652 Swap: 4194300 24764 4169536 I also tried the same thing on Ubuntu, which causes a total system hang instead. ;) And to clarify, I am not asking how to leave memory free for the "system", but for "cache". I know that cache memory is automatically given back to the system when needed, but my problem is that it is not reserved for caching of specific things. Question: Is there some way to tell these copy operations to limit memory usage so some important things remain cached, and therefore any slowdowns are a result of normal disk usage and not rereading the same commonly used files? For example, is there a setting of max memory per process/user/file system allowed to be used as cache/buffers?
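    There is no per-process or per-filesystem cache quota to set directly, but a few well-known knobs come close to what is described above; a sketch of the usual options (package availability, device names, and the cgroup layout vary by distribution, so treat the details as assumptions):

        # 1. Bypass the page cache for the copy itself with direct I/O
        #    (suits whole-disk images; the block size is just a tuning guess)
        dd if=/dev/sdb of=/dev/sdc bs=1M iflag=direct oflag=direct

        # 2. Or wrap the copy in the 'nocache' helper, which uses
        #    posix_fadvise(POSIX_FADV_DONTNEED) to drop the copied pages again
        nocache cp -a /mnt/source/ /mnt/target/

        # 3. Or confine the copy's page-cache usage with a memory cgroup
        #    (cgroup v1 layout shown; the 4 GB figure is arbitrary)
        mkdir /sys/fs/cgroup/memory/diskcopy
        echo $((4*1024*1024*1024)) > /sys/fs/cgroup/memory/diskcopy/memory.limit_in_bytes
        echo $$ > /sys/fs/cgroup/memory/diskcopy/tasks
        cp -a /mnt/source/ /mnt/target/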

    Read the article

  • Share a "deep link" from a Windows 8/WinRT application

    - by Dave Parker
    I have searched using many different terms and phrases, and waded through many pages of results, but I have (remarkably) not seen anyone else addressing, even asking, about, this issue. So here goes... Ultimate Goal: Allow a user viewing a content-based page (may contain both text and images) within a Windows Store app to share that content with someone else. Description I am working on taking a fair amount of content and making it available for browsing/navigating as a Windows 8/WinRT/Windows Store (we need a consistent name here) application. One of the desired features is to take advantage of the Share Charm, such that someone viewing a page could share that page with someone else. The ideal behavior is for the application to implement the Share Source contract which would share an email message that contained some explanatory text, a link to get the app from the Windows Store, and a "deep link" into the shared page in the application. Solutions Considered We had originally looked at just generating a PDF representation of the page, but there are very few external libraries that would work under WinRT, and having to include externally licensed code would be problematic as well. Writing our own PDF generation code would out of scope. We have also considered generating a Word document or PowerPoint slide using OpenXML, but again, we run up against the limitaions of WinRT. In this case, it is highly unlikely the OpenXML SDK is useable in a WinRT application. Another thought was to pre-generate all of the pages as .pdf files, store them as resources, and when the Share Charm is invoked, share the .pdf file associated with the current page. The problem here is the application will have at least 150 content pages, and depending on how we break the content down, up to over 600. This would likely cause serious bloat. Where We Are At Thus we have come to sharing URIs. From what I can tell, though, the "deep linking" feature is only intended for use on Secondary Tiles tied to your application. Another avenue I considered was registering a protocol like, "my-special-app:" with the OS and having it fire up the application but that would require HKCR registry access, which is outside the WinRT sandbox. If it matters, we are leaning towards an HTML/JS application, rather than XAML/C#, because the converted content will all be in HTML and the WebView control in WinRT is fairly limited. This decision is not yet final, though. Conclusion So, is this possible, and if so, how would it be done or where can I find documentation on it? Thanks, Dave Parker
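    One note on the "my-special-app:" idea: a Windows Store app can declare a custom URI scheme in its package manifest (a Protocol declaration) and handle it in the activation event, with no registry access required, so a shared link can deep-link back into the installed app. A rough sketch, assuming the HTML/JS route and an invented scheme name:

        <!-- Package.appxmanifest, inside <Application> -->
        <Extensions>
          <Extension Category="windows.protocol">
            <Protocol Name="my-special-app" />
          </Extension>
        </Extensions>

        // default.js - handle protocol activation
        WinJS.Application.addEventListener("activated", function (args) {
            if (args.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.protocol) {
                var uri = args.detail.uri;      // e.g. my-special-app://page/42
                navigateToPage(uri.path);       // hypothetical navigation helper
            }
        });

    The shared text can then carry both the Store link and a my-special-app: URI; recipients with the app installed land on the right page, and others fall back to the Store.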

    Read the article

  • Buy or Build for web deployment?

    - by Cannonade
    I have been evaluating the wide range of installation and web deployment solutions available for Windows applications. I will just clarify here (without too much detail, these tools have been covered in other questions) my understanding of the options: NSIS - Free tool that generates setup executables. Small binary. Specialized, sometimes obtuse, scripting language. Inno Setup - Free tools for setup executables. Various binary compression schemes. Pascal scripting engine. WIX - Free toolset to generate MSI binaries. XML definitions language. WIX ClickThrough - Additional tools for packaging, web download and auto update detection (now part of WIX core). InstallShield - Commercial development environment for installation packaging. Generates MSI binaries. C-like InstallScript language. Wise - Commercial development environment for installation packaging. Generates MSI binaries. ClickOnce - Visual Studio supported framework for publishing applications to a webserver, with automatic detection of updates. No support for custom installation requirements (INI files, registry etc ...). Packages setup as an MSI binary. Install Aware - Commercial development environment for installation. Generates MSI binaries. Automatic Update framwork (Web Update). If I have missed any, please let me know. And found some useful discussions of these technologies on StackOverflow: Best Simple Install System Best choice for Windows installers Alternatives to ClickOnce I have worked with a few of these solutions, as well as a handful of proprietary internal installation solutions. They are mostly concerned with packing installations and providing a framework for developers to access the run time environment. With the growing requirement for web deployment and automatic software updates, I expected to find more of a consensus among developers on a framework for web delivery of software and subsequent updates, I haven't really found that consensus. There are certainly solutions available (ClickOnce, ClickThrough, InstallShield Update Service), but they each have considerable limitations (please correct me if I mis-represent any of these). I would be interested in a framework that provided some of the following: Third party hosting/management of updates. Access to client environment (INI files, registry, etc..). User registration/activation. Feedback/Error reporting This is leaving me with the strong impression that the best way to approach the web deployment problem is through a custom built proprietary solution (possibly leveraging existing installer packaging). I have seen this sort of solution work well for a number of successful applications: FileZilla - HTTP request to update.filezilla-project.org to check for updates, downloads an NSIS binary (I think) and then shuts down to run the install.

    Read the article

  • What facets have I missed for creating a 3-person guerrilla dev team?

    - by Penguinix
    Sorry for the Windows developers out there, this solution is for Macs only. This set of applications accounts for: Usability Testing, Screen Capture (Video and Still), Version Control, Task Lists, Bug Tracking, a Developer IDE, a Web Server, A Blog, Shared Doc Editing on the Web, Team and individual Chat, Email, Databases and Continuous Integration. This does assume your team members provide their own machines, and one person has a spare old computer to be the Source Repository and Web Server. All for under $200 bucks. Usability Silverback Licenses = 3 x $49.95 "Spontaneous, unobtrusive usability testing software for designers and developers." Source Control Server and Clients (multiple options) Subversion = Free Subversion is an open source version control system. Versions (Currently in Beta) = Free Versions provides a pleasant work with Subversion on your Mac. Diffly = Free "Diffly is a tool for exploring Subversion working copies. It shows all files with changes and, clicking on a file, shows a highlighted view of the changes for that file. When you are ready to commit Diffly makes it easy to select the files you want to check-in and assemble a useful commit message." Bug/Feature/Defect Tracking (multiple options) Bugzilla = Free Bugzilla is a "Defect Tracking System" or "Bug-Tracking System". Defect Tracking Systems allow individual or groups of developers to keep track of outstanding bugs in their product effectively. Most commercial defect-tracking software vendors charge enormous licensing fees. Trac = Free Trac is an enhanced wiki and issue tracking system for software development projects. Database Server & Clients MySQL = Free CocoaMySQL = Free Web Server Apache = Free Development and Build Tools XCode = Free CruiseControl = Free CruiseControl is a framework for a continuous build process. It includes, but is not limited to, plugins for email notification, Ant, and various source control tools. A web interface is provided to view the details of the current and previous builds. Collaboration Tools Writeboard = Free Ta-da List = Free Campfire Chat for 4 users = Free WordPress = Free "WordPress is a state-of-the-art publishing platform with a focus on aesthetics, web standards, and usability. WordPress is both free and priceless at the same time." Gmail = Free "Gmail is a new kind of webmail, built on the idea that email can be more intuitive, efficient, and useful." Screen Capture (Video / Still) Jing = Free "The concept of Jing is the always-ready program that instantly captures and shares images and video…from your computer to anywhere." Lots of great responses: TeamCity [Yo|||] Skype [Eric DeLabar] FogBugz [chakrit] IChatAV and Screen Sharing (built-in to OS) [amrox] Google Docs [amrox]

    Read the article

  • Square Peg Web: Gets you the traffic to where it matters most: Your Website!

    - by demetriusalwyn
    Have you decided to start your business online or is your business not reaching the targeted audience? Come to Square Peg Web; where you will find what you want to make your business reach new heights. The team at Square Peg Web is professionals who understand what you want and make sure you get it right. Our confidence stems from the fact of thousands of satisfied clients who keep referring friends and business associates to us and we do not let our clients down. Many companies promise the sky but how far is does their work live up to the promises? We do not know about the others however, we are sure that we strive to put together all our ideas and thoughts to make your website rank among the top. Web hosting is something that needs to have a personal touch; Square Peg Web customizes everything to suit your requirements so that you do not have to look further. With Square Peg Web you have a host of features to make your Business go viral. Some of the product details that are offered with Square Peg Web are unlimited product options/ variants/ properties giving you an option on price modifiers. You get unlimited customized input fields for your products and you can also Customer-define the prices. Square Peg Web provides you an option of using multiple product images with zoom features and one can also list a particular product in several categories. There are other aspects which make Square Peg Web the best choice for your website needs; every sale of yours’ is important to you and to us. We make sure that each sale is tracked by the product and also the list of bestsellers that appeal to the audience. Other comprehensive statistics of Square Peg Web includes searchable order data, an interface for shipments and order fulfillments, export sales & customer data for usage in a spreadsheet and the ability to export orders to QuickBooks format. With Square Peg Web; Admin Panel is a lot simpler. Administrative access is completely password protected and any changes done are all in real-time. You can have absolute control on the cart from anywhere around the world using your web browser and the topping on the cake is the unlimited amount of admin accounts that can be created for you. Square Peg Web offers you a world of experience with the options of choosing from marketing websites to e-commerce and from customized applications to community oriented sites. Some of the projects which appear in the portfolio of Square Peg Web are Online Marketing Web Sites, E-Commerce Web Sites, customized web applications, Blog designing and programming, video sharing and the option of downloading web sites, online advertisements, flash animation, customer and product support web sites, web site re-designing and planning and complete information architecture.

    Read the article

  • Sending email on the local machine is not working

    - by haansi
    I am using my Gmail account to send emails from an ASP.NET website. It works fine on the hosting server, but it does not work if I try to send email from my local server. Please guide me on what I should do to make it send emails on the local server as well. Do I need to install an SMTP server on my local machine? I have not installed any SMTP server on my machine. How and where can I get an SMTP server, and how do I configure it for use on the local machine? Thanks.

    Here is my code:

        public string SendEmail(Email email)
        {
            string errmsg = null;
            if (dt != null)
            {
                try
                {
                    dt = systemrep.GetSystemInfo();
                    dr = dt.Rows[0];
                    From = dr["nm_EmailFrom"].ToString();
                    SMTP = dr["nm_SMTP"].ToString();
                    Port = dr["amt_Port"].ToString();
                    EmailId = dr["nm_emailUserId"].ToString();
                    EmailPassword = dr["nm_emailPassword"].ToString();
                    DefaultCredations = Convert.ToBoolean(dr["ind_Credentials"].ToString());

                    MailMessage message = new MailMessage();
                    SmtpClient smtp = new SmtpClient();
                    NetworkCredential mailAuthentication = new NetworkCredential(EmailId, EmailPassword);
                    message.To.Add(new MailAddress(email.To));
                    message.From = new MailAddress(From);
                    message.IsBodyHtml = true;
                    message.Subject = email.Subject;
                    message.Body = email.Message;
                    smtp.UseDefaultCredentials = DefaultCredations;
                    smtp.EnableSsl = true;
                    smtp.Port = 25;
                    smtp.DeliveryMethod = SmtpDeliveryMethod.Network;
                    smtp.Host = SMTP;
                    smtp.Credentials = new NetworkCredential(EmailId, EmailPassword);
                    smtp.Send(message);
                }
                catch (SmtpException smtpEx)
                {
                    errmsg = string.Format("alert('There was a problem in sending the email: {0}');", smtpEx.Message.Replace("'", "\\'"));
                }
                catch (Exception generalEx)
                {
                    errmsg = string.Format("alert('There was a general problem: {0}');", generalEx.Message.Replace("'", "\\'"));
                }
            }
            else
                errmsg = "An error occurred while getting email settings from the database; the process couldn't be completed";
            return errmsg;
        }
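    No local SMTP server is needed to relay through a Gmail account: Gmail's relay is smtp.gmail.com and it expects TLS on port 587, whereas the code above hardcodes port 25, which many ISPs and local networks block. A minimal sketch with the Gmail settings; the address and password are placeholders:

        using System.Net;
        using System.Net.Mail;

        var smtp = new SmtpClient("smtp.gmail.com", 587)
        {
            EnableSsl = true,
            DeliveryMethod = SmtpDeliveryMethod.Network,
            UseDefaultCredentials = false,          // set this before Credentials
            Credentials = new NetworkCredential("you@gmail.com", "your-password")
        };

        var message = new MailMessage("you@gmail.com", "recipient@example.com",
            "Test subject", "Test body");
        smtp.Send(message);

    If this still fails only on the local machine, an outbound firewall or ISP block on SMTP ports is the next thing to rule out.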

    Read the article

  • "Can't mass-assign protected attributes" with nested protected models

    - by JohnnyFive
    I'm having a hell of a time trying to get this nested model working. I've tried all manner of pluralization/singular, removing the attr_accessible altogether, and who knows what else. restaurant.rb: # == RESTAURANT MODEL # # Table name: restaurants # # id :integer not null, primary key # name :string(255) # created_at :datetime not null # updated_at :datetime not null # class Restaurant < ActiveRecord::Base attr_accessible :name, :job_attributes has_many :jobs has_many :users, :through => :jobs has_many :positions accepts_nested_attributes_for :jobs, :allow_destroy => true validates :name, presence: true end job.rb: # == JOB MODEL # # Table name: jobs # # id :integer not null, primary key # restaurant_id :integer # shortname :string(255) # user_id :integer # created_at :datetime not null # updated_at :datetime not null # class Job < ActiveRecord::Base attr_accessible :restaurant_id, :shortname, :user_id belongs_to :user belongs_to :restaurant has_many :shifts validates :name, presence: false end restaurants_controller.rb: class RestaurantsController < ApplicationController before_filter :logged_in, only: [:new_restaurant] def new @restaurant = Restaurant.new @user = current_user end def create @restaurant = Restaurant.new(params[:restaurant]) if @restaurant.save flash[:success] = "Restaurant created." redirect_to welcome_path end end end new.html.erb: <% provide(:title, 'Restaurant') %> <%= form_for @restaurant do |f| %> <%= render 'shared/error_messages' %> <%= f.label "Restaurant Name" %> <%= f.text_field :name %> <%= f.fields_for :job do |child_f| %> <%= child_f.label "Nickname" %> <%= child_f.text_field :shortname %> <% end %> <%= f.submit "Done", class: "btn btn-large btn-primary" %> <% end %> Output Parameters: {"utf8"=>"?", "authenticity_token"=>"DjYvwkJeUhO06ds7bqshHsctS1M/Dth08rLlP2yQ7O0=", "restaurant"=>{"name"=>"The Pink Door", "job"=>{"shortname"=>"PD"}}, "commit"=>"Done"} The error i'm receiving is: ActiveModel::MassAssignmentSecurity::Error in RestaurantsController#create Cant mass-assign protected attributes: job Rails.root: /home/johnnyfive/Dropbox/Projects/sa Application Trace | Framework Trace | Full Trace app/controllers/restaurants_controller.rb:11:in `new' app/controllers/restaurants_controller.rb:11:in `create' Anyone have ANY clue how to get this to work? Thanks!
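    The error text points at a naming mismatch in the nested attributes: with accepts_nested_attributes_for :jobs, the writer Rails generates is jobs_attributes=, but the form posts a key named job, which is not whitelisted, hence the mass-assignment error. A hedged sketch of the conventional pairing, assuming the has_many :jobs association from the question:

        # restaurant.rb
        class Restaurant < ActiveRecord::Base
          attr_accessible :name, :jobs_attributes   # plural, matching the association
          has_many :jobs
          has_many :users, :through => :jobs
          has_many :positions
          accepts_nested_attributes_for :jobs, :allow_destroy => true
          validates :name, presence: true
        end

    In the view, use the association name as well: f.fields_for :jobs (after building one in the controller with @restaurant.jobs.build), so the posted hash arrives as jobs_attributes instead of job.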

    Read the article

  • Want to set 'src' of script to my IP; won't load in Safari or Chrome (relative link, ASP.NET MVC)

    - by Ozaki
    I have a script that links to the server I am hosting (the IP can change). Usually I would just build links like this:

        var url = 'http://' + window.location.hostname + 'end of url';

    But in this case it isn't proving to be so easy. I have tried:

        // (1)
        $('#scriptid').attr('src', url);

        // (2)
        var script = document.createElement('script');
        script.type = 'text/javascript';
        script.src = url;
        $("#insert").append(script);

    Case (2) works: it loads the script and runs it. But when, at the end of my script, it hits the 'write data' step, it replaces the entire page with just the data. Any idea how I can do this? Note: I am using plain HTML, not ASP; with the ASP.NET backend that is just the way it has to be.

    OK, it now is:

        <script src="myscript.js"></script>

    with C# routing along the lines of router.AddAsyncRoute("myscript.js"...... It works in IE and Firefox, but I get blank pages in Chrome and Safari. I am using document.write to write a script onto my page. Any ideas why Chrome and Safari don't like this? So far I am assuming that in Chrome and Safari the script takes longer to run, so the document.write fires after the DOM has loaded and therefore replaces the page with a blank one.

    Edit: the script I'm trying to run is a modification of:

        d = new dTree('d');
        d.add(0,-1,'My example tree');
        d.add(1,0,'Node 1','default.html');
        d.add(2,0,'Node 2','default.html');
        d.add(3,1,'Node 1.1','default.html');
        d.add(4,0,'Node 3','default.html');
        d.add(5,3,'Node 1.1.1','default.html')
        document.write(d);

    Any ideas how I can get around this? I am not too sure how to implement an appendChild in this case, as the script is changing constantly with live data, so every refresh it will generally have changed somewhat.
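    Since dTree's toString() simply produces the tree's HTML markup (which is why document.write(d) works in the first place), one workaround worth trying is to inject that markup into a container element instead of writing it, which is safe after the DOM has loaded; the container id here is invented:

        var d = new dTree('d');
        d.add(0, -1, 'My example tree');
        d.add(1, 0, 'Node 1', 'default.html');
        // ... remaining nodes exactly as in the question ...

        // instead of document.write(d), which wipes the page once the
        // document has finished loading:
        document.getElementById('treeContainer').innerHTML = d.toString();

    This keeps working with live data, because the markup is regenerated from the dTree object on every load.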

    Read the article

  • C++0x Overload on reference, versus sole pass-by-value + std::move?

    - by dean
    It seems the main advice concerning C++0x's rvalues is to add move constructors and move operators to your classes, until compilers default-implement them. But waiting is a losing strategy if you use VC10, because automatic generation probably won't be here until VC10 SP1, or in worst case, VC11. Likely, the wait for this will be measured in years. Here lies my problem. Writing all this duplicate code is not fun. And it's unpleasant to look at. But this is a burden well received, for those classes deemed slow. Not so for the hundreds, if not thousands, of smaller classes. ::sighs:: C++0x was supposed to let me write less code, not more! And then I had a thought. Shared by many, I would guess. Why not just pass everything by value? Won't std::move + copy elision make this nearly optimal? Example 1 - Typical Pre-0x constructor OurClass::OurClass(const SomeClass& obj) : obj(obj) {} SomeClass o; OurClass(o); // single copy OurClass(std::move(o)); // single copy OurClass(SomeClass()); // single copy Cons: A wasted copy for rvalues. Example 2 - Recommended C++0x? OurClass::OurClass(const SomeClass& obj) : obj(obj) {} OurClass::OurClass(SomeClass&& obj) : obj(std::move(obj)) {} SomeClass o; OurClass(o); // single copy OurClass(std::move(o)); // zero copies, one move OurClass(SomeClass()); // zero copies, one move Pros: Presumably the fastest. Cons: Lots of code! Example 3 - Pass-by-value + std::move OurClass::OurClass(SomeClass obj) : obj(std::move(obj)) {} SomeClass o; OurClass(o); // single copy, one move OurClass(std::move(o)); // zero copies, two moves OurClass(SomeClass()); // zero copies, one move Pros: No additional code. Cons: A wasted move in cases 1 & 2. Performance will suffer greatly if SomeClass has no move constructor. What do you think? Is this correct? Is the incurred move a generally acceptable loss when compared to the benefit of code reduction?

    Read the article

  • E4X in ActionScript help needed

    - by voipsecuritydigest.com
    Here is the XML. Using E4X, how do I read the values of the <status> nodes and of the <invisible value="false"/> node?

        <?xml version="1.0" encoding="utf-8"?>
        <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
                       xmlns:s="library://ns.adobe.com/flex/spark"
                       creationComplete="init()">
            <fx:Declarations>
                <!-- Place non-visual elements (e.g., services, value objects) here -->
            </fx:Declarations>
            <fx:Script>
                <![CDATA[
                    var xml:XML =
                        <iq type="result" id="ss-1">
                            <query status-min-ver="1" status-max="512" status-list-contents-max="5"
                                   status-list-max="3" xmlns="google:shared-status">
                                <status>??</status>
                                <show>default</show>
                                <status-list show="default">
                                    <status>??</status>
                                    <status>?</status>
                                    <status>??</status>
                                </status-list>
                                <status-list show="dnd">
                                    <status>??</status>
                                    <status>dnd, i have bad mood</status>
                                    <status>showering</status>
                                    <status>??_???¦</status>
                                    <status>?</status>
                                </status-list>
                                <invisible value="false"/>
                            </query>
                        </iq>;

                    public function init()
                    {
                        trace(xml.query.invisible.@value);
                    }
                ]]>
            </fx:Script>
        </s:Application>
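    Because the <query> element declares a default namespace (google:shared-status), plain child access such as xml.query returns nothing, so the nodes have to be addressed through that namespace. A short sketch of the usual E4X pattern for this document:

        var ss:Namespace = new Namespace("google:shared-status");

        // value of <invisible value="false"/>
        trace(xml.ss::query.ss::invisible.@value);      // false

        // text of the top-level <status> node
        trace(xml.ss::query.ss::status.toString());

        // or set it as the default namespace for the statements that follow
        default xml namespace = ss;
        trace(xml.query.invisible.@value);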

    Read the article

  • Rails 3: How do I call a JavaScript function from a js.erb file?

    - by user321775
    Now that I've upgraded to Rails 3, I'm trying to figure out the proper way to separate and reuse pieces of javascript. Here's the scenario I'm dealing with: I have a page with two areas: one with elements that should be draggable, the other with droppables. When the page loads I use jQuery to setup the draggables and droppables. Currently I have the script in the head portion of application.html.erb, which I'm sure is not the right solution but at least works. When I press a button on the page, an ajax call is made to my controller that replaces the draggables with a new set of elements that should also be draggable. I have a js.erb file that renders a partial in the correct location. After rendering I need to make the new elements draggable, so I'd like to reuse the code that currently lives in application.html.erb, but I haven't found the right way to do it. I can only make the new elements draggable by pasting the code directly into my js.erb file (yuck). What I'd like to have: - a javascript file that contains the functions prepdraggables() and prepdroppables() - a way to call either function from application.html.erb or from a js.erb file I've tried using :content_for to store and reuse the code, but can't seem to get it working correctly. What I currently have in the head section of application.html.erb <% content_for :drag_drop_prep do %> <script type="text/javascript" charset="utf-8"> $(document).ready(function () { // declare all DOM elements with class draggable to be draggable $( ".draggable" ).draggable( { revert : 'invalid' }); // declare all DOM elements with class legal to be droppable $(".legal").droppable({ hoverClass : 'legal_hover', drop : function(event, ui) { var c = new Object(); c['die'] = ui.draggable.attr("id"); c['cell'] = $(this).attr("id"); c['authenticity_token'] = encodeURIComponent(window._token); $.ajax({ type: "POST", url: "/placeDie", data: c, timeout: 5000 }); }}); }); </script> <% end %> undo.js.erb $("#board").html("<%= escape_javascript(render :partial => 'shared/board', :locals => { :playable => true, :restartable => !session[:challenge]}) %>") // This is where I want to prepare draggables. <%= javascript_include_tag "customdragdrop.js" %> // assuming this file had the draggables code from above in a prepdraggables() function prepdraggables();
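    One way to get the reuse described above is to move the two setup functions into a plain JavaScript file served with the application, include it once from the layout, and call the functions both on page load and from the js.erb response. A sketch under that assumption (the file location is a guess; the function bodies are lifted from the question):

        // public/javascripts/dragdrop.js - include from the layout with
        //   <%= javascript_include_tag "dragdrop" %>
        function prepdraggables() {
            $(".draggable").draggable({ revert: 'invalid' });
        }

        function prepdroppables() {
            $(".legal").droppable({
                hoverClass: 'legal_hover',
                drop: function(event, ui) {
                    var c = {
                        die: ui.draggable.attr("id"),
                        cell: $(this).attr("id"),
                        authenticity_token: encodeURIComponent(window._token)
                    };
                    $.ajax({ type: "POST", url: "/placeDie", data: c, timeout: 5000 });
                }
            });
        }

        $(document).ready(function() {
            prepdraggables();
            prepdroppables();
        });

    undo.js.erb then only needs to re-render the partial and call prepdraggables(); the javascript_include_tag inside the .js.erb is unnecessary (and would just emit an HTML script tag as text in the middle of a JavaScript response), because the functions are already defined on the page.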

    Read the article

  • Many-to-many performance concerns with Fluent NHibernate

    - by Ciel
    I have a situation where I have several many-to-many associations. In the upwards of 12 to 15. Reading around I've seen that it's generally believed that many-to-many associations are not 'typical', yet they are the only way I have been able to create the associations appropriate for my case, so I'm not sure how to optimize any further. Here is my basic scenario. class Page { IList<Tag> Tags { get; set; } IList<Modification> Modifications { get; set; } IList<Aspect> Aspects { get; set; } } This is one of my 'core' classes, and coincidentally one of my core tables. Virtually half of the objects in my code can have an IList<Page>, and some of them have IList<T> where T has its own IList<Page>. As you can see, from an object oriented standpoint, this is not really a problem. But from a database standpoint this begins to introduce a lot of junction tables. So far it has worked fine for me, but I am wondering if anyone has any ideas on how I could improve on this structure. I've spent a long time thinking and in order to achieve the appropriate level of association required, I cannot think of any way to improve it. The only thing I have come up with is to make intermediate classes for each object that has an IList<Page>, but that doesn't really do anything that the HasManyToMany does not already do except introduce another class. It does not extend the functionality and, from what I can tell, it does not improve performance. Any thoughts? I am also concerned about Primary Key limits in this scenario. Most everything needs to be able to have these properties, but the Pages cannot be unique to each object, because they are going to be frequently shared and joined between multiple objects. All relationships are one-sided. (That is, a Page has no knowledge of what owns it). Because of this, I also have no Inverse() mapped HasManyToMany collections. Also, I have read the similar question : Usage of ORMs like NHibernate when there are many associations - performance concerns But it really did not answer my concerns.

    Read the article

  • UserControl hosted in IE renders as a textbox

    - by coxymla
    On my ongoing saga to mirror the hosting of a legacy app on a clean box, I've hit my next snag. One page relies on a big .NET UserControl that on the new machine renders only as a big, greyed out textarea (greyed out vertical scrollbar on the right hand edge. Inspecting the source shows the expected object tag.) This is particularly tricky because nobody seems to know much about hosted UserControls and all the discussions data back to 2002-2004. The page is quite simple: <%@ Page language="c#" Codebehind="DataExport.aspx.cs" AutoEventWireup="false" Inherits="yyyyy.Web.DataExport" %> <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" > <html> <head> <title>DataExport</title> <link rel="Configuration" href="/xxxxx/yyyyy/DataExport.config"> </head> <body style="margin:0px;padding:0px;overflow:hidden"> <OBJECT id="DataExport" style="WIDTH: 100%; HEIGHT: 100%; position:absolute; left: 0px; top:0px" classid="yyyyy.Common.dll#yyyyy.Controls.DataExport" VIEWASTEXT> </OBJECT> </body> </html> The config file referenced: <?xml version="1.0" encoding="utf-8" ?> <configuration> <configSections> <sectionGroup name="yyyyy"> <section name="dataExport" type="yyyyy.Controls.DataExportSectionHandler,yyyyy.Common" /> </sectionGroup> </configSections> <yyyyy> <dataExport> <layoutFile>http://vm2/xxxxx/yyyyy/layout.xml</layoutFile> <webServiceUrl>http://vm2/xxxxx/yyyyy/services/yyyyy.asmx</webServiceUrl> </dataExport> </yyyyy> </configuration> What I've checked: Security permissions should be OK, the site is trusted and adding a URL exception to grant FullTrust doesn't change anything. Config file is acessible over the web, layout.xml is accessible, ASMX shows the expected command list Machine.config grants GET permission for the usercontrol.config file. What perhaps looks fishy to me: The DataExport UserControl references Aspose.Excel to generate the spreadsheets it exports. When I navigate to the page and get a blank textbox, then run gacutil /ldl, nothing is in the local download cache. On the working machine, running the same command after viewing the page shows a laundry list of DLLs including the control DLL and the Aspose DLL.

    Read the article

  • C++ Virtual Constructor, without clone()

    - by Julien L.
    I want to perform "deep copies" of an STL container of pointers to polymorphic classes. I know about the Prototype design pattern, implemented by means of the Virtual Ctor Idiom, as explained in the C++ FAQ Lite, Item 20.8. It is simple and straightforward: struct ABC // Abstract Base Class { virtual ~ABC() {} virtual ABC * clone() = 0; }; struct D1 : public ABC { virtual D1 * clone() { return new D1( *this ); } // Covariant Return Type }; A deep copy is then: for( i = 0; i < oldVector.size(); ++i ) newVector.push_back( oldVector[i]->clone() ); Drawbacks As Andrei Alexandrescu states it: The clone() implementation must follow the same pattern in all derived classes; in spite of its repetitive structure, there is no reasonable way to automate defining the clone() member function (beyond macros, that is). Moreover, clients of ABC can possibly do something bad. (I mean, nothing prevents clients to do something bad, so, it will happen.) Better design? My question is: is there another way to make an abstract base class clonable without requiring derived classes to write clone-related code? (Helper class? Templates?) Following is my context. Hopefully, it will help understanding my question. I am designing a class hierarchy to perform operations on a class Image: struct ImgOp { virtual ~ImgOp() {} bool run( Image & ) = 0; }; Image operations are user-defined: clients of the class hierarchy will implement their own classes derived from ImgOp: struct CheckImageSize : public ImgOp { std::size_t w, h; bool run( Image &i ) { return w==i.width() && h==i.height(); } }; struct CheckImageResolution; struct RotateImage; ... Multiple operations can be performed sequentially on an image: bool do_operations( std::vector< ImgOp* > v, Image &i ) { std::for_each( v.begin(), v.end(), /* bind2nd(mem_fun(&ImgOp::run), i ...) don't remember syntax */ ); } int main( ... ) { std::vector< ImgOp* > v; v.push_back( new CheckImageSize ); v.push_back( new CheckImageResolution ); v.push_back( new RotateImage ); Image i; do_operations( v, i ); } If there are multiple images, the set can be split and shared over several threads. To ensure "thread-safety", each thread must have its own copy of all operation objects contained in v -- v becomes a prototype to be deep copied in each thread.

    Read the article

  • Memory management with Objective-C Distributed Objects: my temporary instances live forever!

    - by jkp
    I'm playing with Objective-C Distributed Objects and I'm having some problems understanding how memory management works under the system. The example given below illustrates my problem: Protocol.h #import <Foundation/Foundation.h> @protocol DOServer - (byref id)createTarget; @end Server.m #import <Foundation/Foundation.h> #import "Protocol.h" @interface DOTarget : NSObject @end @interface DOServer : NSObject < DOServer > @end @implementation DOTarget - (id)init { if ((self = [super init])) { NSLog(@"Target created"); } return self; } - (void)dealloc { NSLog(@"Target destroyed"); [super dealloc]; } @end @implementation DOServer - (byref id)createTarget { return [[[DOTarget alloc] init] autorelease]; } @end int main() { NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; DOServer *server = [[DOServer alloc] init]; NSConnection *connection = [[NSConnection new] autorelease]; [connection setRootObject:server]; if ([connection registerName:@"test-server"] == NO) { NSLog(@"Failed to vend server object"); } else [[NSRunLoop currentRunLoop] run]; [pool drain]; return 0; } Client.m #import <Foundation/Foundation.h> #import "Protocol.h" int main() { unsigned i = 0; for (; i < 3; i ++) { NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; id server = [NSConnection rootProxyForConnectionWithRegisteredName:@"test-server" host:nil]; [server setProtocolForProxy:@protocol(DOServer)]; NSLog(@"Created target: %@", [server createTarget]); [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:1.0]]; [pool drain]; } return 0; } The issue is that any remote objects created by the root proxy are not released when their proxy counterparts in the client go out of scope. According to the documentation: When an object’s remote proxy is deallocated, a message is sent back to the receiver to notify it that the local object is no longer shared over the connection. I would therefore expect that as each DOTarget goes out of scope (each time around the loop) it's remote counterpart would be dellocated, since there is no other reference to it being held on the remote side of the connection. In reality this does not happen: the temporary objects are only deallocate when the client application quits, or more accurately, when the connection is invalidated. I can force the temporary objects on the remote side to be deallocated by explicitly invalidating the NSConnection object I'm using each time around the loop and creating a new one but somehow this just feels wrong. Is this the correct behaviour from DO? Should all temporary objects live as long as the connection that created them? Are connections therefore to be treated as temporary objects which should be opened and closed with each series of requests against the server? Any insights would be appreciated.

    Read the article
