Another Guy
Forum Replies Created
Forum: Requests and Feedback
In reply to: WordPress ought to issue deprecation notices, really
“In any case, the WordPress core upgrade yesterday has a routine that goes through looking for that particular file in all themes and plugins, and then it deletes it.”
Would that be on all upgrades, or only those using the automatic upgrades?
Forum: Requests and Feedback
In reply to: Time For a Major Rethink
Hi Clayton.
Automated updates are good to a certain extent, but they come with risks. Most WordPress installs are a combination of core code, plugins, themes, and sometimes custom code, and any one of those things can break the others.
So I use other methods to update the sites I maintain. Some require SFTP or similar to transfer files. Servers with multiple installations get a single upload followed by an automated copy out to each target WP install. Some MU sites use a combination of these methods plus more careful checks to make sure the hosted sites don’t go down. For what it’s worth, these were the standard methods across the WordPress world for all installs until automatic updates came along.
It’s also generally not a good idea to provide direct FTP access into a server for a remote update, especially if that server hosts many sites or many user accounts. That is a potential security hole you don’t want to leave open, and I do fear the day someone figures out how to tap into the current WP automated process to do harm.
The downside is that security updates that require files to be removed do not work under the current setup. FTP or copy/replace does not account for files that are no longer in use or no longer in the package. The only way currently to be 100% sure is to literally delete the entire WordPress install (except data files and uploads) and reinstall from scratch – or to run a file-by-file comparison looking for stray files.
It’s a lot of work – multiply it by the close to 200 installs I touch each time an update comes out and you can imagine the effort involved. Most people just won’t do it, and as a result, the WordPress installs I have seen over 10+ years of using it are generally hellacious collections of dead plugins, unused themes, and other flotsam.
So my suggestion is to create an installer that works for everyone. That would mean moving the installer part of the automated updates out into the open so that the rest of us can use it. Upload the package, run the installer, and it does what it does – including deleting abandoned files.
I also think it would be really good to include a report that scans all WP core directories and confirms there are no extra files, and no files that do not match up to the current version.
Realistically, the same should apply to themes and plugins as well. Each update should include a way to confirm all of the valid files and to remove or flag any that are not part of the current package. That would very likely mean an extra step in the process of approving theme and plugin updates, to verify the file changes. Something like that might have caught the big jump in the number of files in the Yoast plugin update that shipped a whole bunch of stuff nobody needed – and if not, it would at least give EVERYONE using the product a better way to keep their systems clean.
It would likely need an interface to allow manual triggering of updates, so the admin could control what does and does not get updated.
Forum: Requests and Feedback
In reply to: Time For a Major Rethink
Mika, that only applies to automatic updates. It doesn’t apply to the rest of us who cannot risk letting an automated update destroy sites with untested theme updates and incompatible code. With so many problems coming up in themes and plugins, and with no reliable standard for what counts as a major upgrade, it’s hard to see this really working out.
It would be much more inclusive if this were an “installer” separate from the automation. You upload (or automatically download) all of the packages, theme updates, and the like, and then run the installer – which does the work and CLEANS UP. Right now there are multiple ways for WordPress to be managed and maintained, and only one of them (automatic updates) accomplishes what is needed to secure a website.
Forum: Requests and Feedback
In reply to: WordPress ought to issue deprecation notices, really
Code review is important, and by its nature it could also be part of a solution for telling the difference between dead code and old code that still does the job. Right now, there is no simple way to tell.
The genericons issue (and others in the past) is particularly important in an older code base. How many of the themes currently on www.ads-software.com for download might have this code in them? Based on what I see in server logs, there is plenty of bad stuff out there, and the script kiddies are hammering away at it. From my current experience, more than 20% of the spam and attacks on WordPress sites now come from other compromised WordPress installs.
At some point, there needs to be a better way to tell good from bad without having to just take your chances.
Forum: Requests and Feedback
In reply to: Support forum is limited and not user firendly
Mohika, it’s good that you got support on a plugin, because honestly it is very hit and miss. Many creators (even the very big ones) snobbishly don’t participate here at all, and insist that you visit their private walled-garden site and provide personal information (and sometimes payment) before they will answer any questions. It’s the nature of the beast.
You have to remember that support here is generally not provided by people who actually code WordPress or work for the company; it is mostly a group of volunteers who feel they know better than the rest of us. That is sometimes a pain when it comes to ideas and concepts outside their range, as they tend to shoot them down and the coders never see the question in the first place.
The general support answer here is “there is a plugin that sort of does that”. It’s perhaps the least helpful answer, especially if the plugin isn’t supported or updated. I truly hope the Requests and Feedback section will be more closely followed by core developers; getting more of them on board to respond to ideas, suggestions, and views would be a very big plus for the community as a whole.
Forum: Themes and Templates
In reply to: [raindrops] Posts Titles disappeared since upgrade to WP-4.2
If you have a bug like this, it might be worth fixing the download and bumping the version number. It’s not a lot of fun to download a file two days after you have a fix, only to find it’s still not fixed.
Forum: Requests and Feedback
In reply to: WordPress ought to issue deprecation notices, really
Agreed. I think that at a certain point, WordPress needs a simple way to make ancient and unsupported plugins and themes unusable – or at least not EASILY usable – on current installations.
I agree, there should be warnings on the admin panel when plugins are getting old. WordPress should also work to keep in contact with developers and ask them to update their plugins from time to time, at least to confirm that they are still live and maintained. Plugins the authors don’t check in on should be marked “not maintained”, which would discourage the use of plugins that might cause future issues.
Forum: Requests and Feedback
In reply to: Option for Comments Disabled
Jan, nice, but it doesn’t do the whole thing. It only disables comments within the theme and such; it appears to still allow comment spammers to connect to your installation and dump their spam at you – or at bare minimum, to spin up an instance of WordPress that burns bandwidth and system resources. In order to decide whether comments are enabled or disabled, WordPress must first start an instance, connect to the database, read the site and setup information, and only then process the comment request.
Disable Comments appears to be good for removing the comment elements from the pages. But spammers don’t give a crap about what is on your page; they POST directly. Even when they are wrong or comments are closed, they still waste system resources. So having comments not working, not answering, not processing (and perhaps the code not even being there at all) would be a big step towards making those sorts of sites less of a target.
For what it’s worth, a true CMS for business sites would likely not want user accounts, comments, or any other method by which people could access the site, except for very controlled feedback scripts that do not touch the WordPress core in any manner. A good chunk of security can be achieved by not having legacy services on a site that doesn’t use them.
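Short of removing the code, one common workaround that matches the goal above is to reject comment POSTs at the web server, before PHP and WordPress ever boot. A minimal Apache 2.4 sketch (the equivalent is possible in nginx; adapt to your own setup and test before deploying):

```apache
# Deny all requests to the comment handler so spam POSTs are
# rejected by Apache itself, before a WordPress instance starts.
<Files "wp-comments-post.php">
    Require all denied
</Files>
```

Spammers hitting `wp-comments-post.php` then get a cheap 403 from the web server instead of a full WordPress bootstrap.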
Forum: Requests and Feedback
In reply to: Option for Comments Disabled
Considering the XSS problem in 4.2, this option makes even more sense. If your blog or site does not support comments, it would be nice to have none of that code available for attackers to pick at.
Forum: Networking WordPress
In reply to: MU multi domain site_url missing http
Let me add: what would be the point of overriding PageSpeed? I can’t think of a reason why I would want to give back significant performance improvements.
Forum: Networking WordPress
In reply to: MU multi domain site_url missing http
Ipstenu, no, the .htaccess is as simple as it’s allowed to be. It has to be when you are using multiple domains – you can’t do anything “magic” without affecting every other site as well.
So the answer is a resounding no: the .htaccess is plain.
Forum: Networking WordPress
In reply to: MU multi domain site_url missing http
Actually, PageSpeed is pretty much required; otherwise WordPress is too slow for Google and sites get pushed down the results.
Again, why would it do that for MU sites but not for standalone sites on the same server running the same config for all?
Forum: Networking WordPress
In reply to: MU multi domain site_url missing http
Whole list on the server? PageSpeed, memcache, mod_expires, and about 200 other things – hard to tell. I have sites on and off Cloudflare, and sites on and off mod_pagespeed. I cannot spot a relationship between those and the sites that do and do not have the appropriate siteurl, except that MU sites have issues and regular sites don’t – on the same server.
Not sure what is causing it; the only commonality I see is MU. I will have to take a domain out of an MU install and move it just to see.
Forum: Networking WordPress
In reply to: MU multi domain site_url missing http
I have done testing by stripping a site down to the basic 2014 theme and disabling all plugins (a pain in the butt, as it affects every other site on the install) and I still see the same results. I don’t have a single standalone site with the issue (and I manage many), but all of the MU installs have the issue – even with the same theme and plugin mix.
Protocol-relative URLs (PRURLs) seem like a great idea in theory. In reality, any number of bots don’t work with them, and some get intensely confused by them. It’s especially annoying if your theme or posts contain a number of images, as each one is converted, creating a bunch of false relative paths for bots (including Googlebot and now Bing) that trip over this problem and flail around generating piles of 404s and server load (because most 404 pages trigger another WordPress page load, which loads the system up).
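The failure mode described above is easy to demonstrate. A spec-compliant resolver treats `//host/path` as “same scheme, different host”, while a naive bot that only recognizes links starting with `http` glues the reference onto the current site instead. A small sketch (the host names are made up; the “naive” line just imitates what a broken crawler would produce):

```python
from urllib.parse import urljoin

page = "http://example.com/blog/post/"
ref = "//cdn.example.com/img.png"  # a protocol-relative reference

# Correct resolution per RFC 3986: keep the scheme, swap the host.
good = urljoin(page, ref)

# What a naive bot does: treat it as a site-relative path and bolt
# it onto the document root, producing a URL that will 404.
naive = page.split("/blog")[0] + "/" + ref.lstrip("/")

print(good)   # http://cdn.example.com/img.png
print(naive)  # http://example.com/cdn.example.com/img.png
```

Every image on a page resolved the “naive” way is one more 404, and on most WordPress setups each of those 404s spins up another full page load.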
Can you point me to anything here on the WordPress site about it?
Forum: Networking WordPress
In reply to: MU multi domain site_url missing httpThey may be “correct” but it is not consistent. on MU, they are like this. On non-MU sites, they are the other way. What happens is that bots that look for “http” as a trigger to spot a link are NOT seeing it. Instead, it generates errors. Some see it and attach the document root instead, creating 404s.
It would appear even Googlebot fails on some of these, at least from what I am seeing in the Apache error log.
Why would it be one way on MU sites, and the same links displayed differently on non-MU sites?
Oh, those sites don’t use SSL – it’s a royal pain to set up for MU sites that use multiple domains.