Large subsets of users
Hi,
We are using your plugin on a site that has around 10k users, and the subscriber list is still growing. When we run an export, the plugin runs out of memory. The PHP memory limit is already set to 512MB per request, which is very high. Our server is beefy, but I’d rather not throw more resources at a growing problem, so I rewrote the generate_csv() function to handle memory a little better. I don’t see a place to submit code, so I figured I would open a ticket here for it.
New generate_csv() function:
if ( ! isset( $_POST['_wpnonce-pp-eu-export-users-users-page_export'] ) ) {
    return;
}
check_admin_referer( 'pp-eu-export-users-users-page_export', '_wpnonce-pp-eu-export-users-users-page_export' );

$per_block = 100;
$uargs = array(
    'fields' => 'all_with_meta',
    'role'   => stripslashes( $_POST['role'] ),
    'number' => 1,
    'offset' => 0,
);

// Run a cheap one-row query first, just to get the total user count.
add_action( 'pre_user_query', array( $this, 'pre_user_query' ) );
$uq = new WP_User_Query( $uargs );
remove_action( 'pre_user_query', array( $this, 'pre_user_query' ) );

$count = $uq->total_users;
if ( $count < 1 ) {
    $referer = add_query_arg( 'error', 'empty', wp_get_referer() );
    wp_redirect( $referer );
    exit;
}

$sitename = sanitize_key( get_bloginfo( 'name' ) );
if ( ! empty( $sitename ) ) {
    $sitename .= '.';
}
$filename = $sitename . 'users.' . date( 'Y-m-d-H-i-s' ) . '.csv';

header( 'Content-Description: File Transfer' );
header( 'Content-Disposition: attachment; filename=' . $filename );
header( 'Content-Type: text/csv; charset=' . get_option( 'blog_charset' ), true );

$exclude_data = apply_filters( 'pp_eu_exclude_data', array() );

global $wpdb;

$data_keys = array(
    'ID', 'user_login', 'user_pass', 'user_nicename', 'user_email',
    'user_url', 'user_registered', 'user_activation_key', 'user_status', 'display_name',
);
$meta_keys = $wpdb->get_results( "SELECT distinct(meta_key) FROM $wpdb->usermeta" );
$meta_keys = wp_list_pluck( $meta_keys, 'meta_key' );

$fields = array_merge( $data_keys, $meta_keys );

$headers = array();
foreach ( $fields as $key => $field ) {
    if ( in_array( $field, $exclude_data ) ) {
        unset( $fields[ $key ] );
    } else {
        $headers[] = '"' . $field . '"';
    }
}
echo implode( ',', $headers ) . "\n";

// Page through the users $per_block at a time instead of loading them all at once.
$uargs['number'] = $per_block;
$uq    = new WP_User_Query( $uargs );
$users = (array) $uq->get_results();

while ( is_array( $users ) && ! empty( $users ) ) :
    // Flush cached user meta once memory usage passes 75% of the limit.
    self::_memory_check( 75 );

    foreach ( $users as $user ) {
        $data = array();
        foreach ( $fields as $field ) {
            $value  = isset( $user->{$field} ) ? $user->{$field} : '';
            $value  = is_array( $value ) ? serialize( $value ) : $value;
            $data[] = '"' . str_replace( '"', '""', $value ) . '"';
        }
        echo implode( ',', $data ) . "\n";
    }

    // Fetch the next block.
    $uargs['offset'] += $per_block;
    $uq    = new WP_User_Query( $uargs );
    $users = $uq->get_results();
endwhile;

exit;
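As a side note: the loop above escapes quotes by hand. If any of this gets folded into the plugin, the row-writing step could instead use PHP's fputcsv() on the php://output stream, which handles quoting and escaping for you. Here is a rough, untested sketch of just that part, assuming $fields and $users are built exactly as above:

// Sketch only: same paging loop body, but writing rows with fputcsv()
// instead of hand-built quoting. Assumes $fields and $users from the code above.
$out = fopen( 'php://output', 'w' );
fputcsv( $out, $fields ); // header row

foreach ( $users as $user ) {
    $row = array();
    foreach ( $fields as $field ) {
        $value = isset( $user->{$field} ) ? $user->{$field} : '';
        $row[] = is_array( $value ) ? serialize( $value ) : $value;
    }
    fputcsv( $out, $row );
}

fclose( $out );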
To make this work, I also added a function that cleans up between each block of users it pulls, by flushing the WordPress object cache that holds the user meta (the actual source of the memory growth). Here is that function:
protected static function _memory_check( $flush_percent_range = 80 ) {
    global $wpdb;
    static $max = false;

    $dec = $flush_percent_range / 100;

    if ( $max === false ) {
        // Parse the memory_limit ini value (e.g. "512M") into bytes.
        $raw = ini_get( 'memory_limit' );
        if ( preg_match( '#^\s*(\d+)\s*([kmg]?)\s*$#i', $raw, $matches ) ) {
            $max  = (int) $matches[1];
            $unit = $matches[2];
            switch ( strtolower( $unit ) ) {
                case 'k': $max *= 1024; break;
                case 'm': $max *= 1048576; break;
                case 'g': $max *= 1073741824; break;
            }
        } else {
            // Fall back to a conservative 32MB if the limit can't be parsed.
            $max = 32 * 1048576;
        }
    }

    $usage = memory_get_usage();
    if ( $usage > $max * $dec ) {
        // Dump cached user meta and the query log to free memory.
        wp_cache_flush();
        $wpdb->queries = array();
    }
}
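For what it's worth, WordPress core also ships a helper, wp_convert_hr_to_bytes(), that converts the "512M"-style shorthand into bytes, so the parsing above could probably be collapsed to a couple of lines. A minimal, untested sketch (the _memory_check_alt name is just for illustration; it would sit in the same class as the original):

// Sketch only: same idea as _memory_check(), but letting WordPress parse
// the memory_limit shorthand via wp_convert_hr_to_bytes().
protected static function _memory_check_alt( $flush_percent_range = 80 ) {
    global $wpdb;

    $max = wp_convert_hr_to_bytes( ini_get( 'memory_limit' ) );

    // $max <= 0 means an unlimited memory_limit, so skip the flush entirely.
    if ( $max > 0 && memory_get_usage() > $max * ( $flush_percent_range / 100 ) ) {
        wp_cache_flush();          // drop cached user meta
        $wpdb->queries = array();  // drop the query log (when SAVEQUERIES is on)
    }
}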
Hopefully, this will help others, and hopefully it will get added to the core plugin. Until then, I’ll be running my customized version.
Lou