24 files changed, 380 insertions, 94 deletions
diff --git a/README.md b/README.md
--- a/README.md
+++ b/README.md
@@ -1,23 +1,38 @@
 # :zap: spark
-Spark is a CPU profiling plugin based on sk89q's [WarmRoast profiler](https://github.com/sk89q/WarmRoast).
+spark is a performance profiling plugin based on sk89q's [WarmRoast profiler](https://github.com/sk89q/WarmRoast).
 
 The latest downloads are [available on Jenkins](https://ci.lucko.me/job/spark/).
 
 ## What does it do?
 
-Effectively, it monitors the activity of the server, and records statistical data about which actions take up the most processing time. These statistics can then be used to diagnose potential performance issues with certain parts of the server or specific plugins.
+spark is made up of a number of components, each detailed separately below.
+
+### CPU Profiler (process sampling)
+This is the primary component of spark - a lightweight CPU sampler with a corresponding web analysis view, based on WarmRoast.
+
+The sampler records statistical data about which actions take up the most processing time. These statistics can then be used to diagnose potential performance issues with certain parts of the server or specific plugins.
 
 Once the data has been recorded, a "call graph" can be formed and displayed in a web browser for analysis.
 
-spark will not fix "lag" - it is a tool to help diagnose the cause of poor performance.
+A profiler like the one in spark will not magically fix "lag" - it is merely a tool to help diagnose the cause of poor performance.
+
+### Tick Monitor (server tick monitoring)
+
+This component monitors the speed at which the game server is processing "ticks". It can be used to spot trends and identify the nature of a performance issue relative to other system events (garbage collection, game actions, etc.).
+
+### Memory Inspection (heap analysis & GC monitoring)
 
-## About
+This component provides a function which can be used to take basic snapshots of system memory usage, including information about potentially problematic classes, with estimated sizes and instance counts for objects in the JVM.
+
+Unlike the other "profiler"-like functionality in spark, this component is *not* intended to be a full replacement for proper memory analysis tools - it just shows a simplified view.
+
+## Features
 
 ### WarmRoast features
 
 These features are carried over from the upstream "WarmRoast" project.
 
-* The viewer is entirely web-based— no specialist software is required to view the output, just a web browser!
+* The viewer is entirely web-based — no specialist software is required to view the output, just a web browser!
 * Output is arranged as a stack of nested method calls, making it easy to interpret.
 * Nodes can be expanded and collapsed to reveal timing details and further child nodes.
 * The overall CPU usage and contribution of a particular method can be seen at a glance.
 
@@ -29,14 +44,15 @@ These features are carried over from the upstream "WarmRoast" project.
 
 WarmRoast is an amazing tool for server admins, but it has a few flaws.
 
-* It is not accessible to some people, because in order to use it, you need to have direct SSH (or equivalent) access to the server. (not possible on shared hosts)
-* It can be somewhat clunky to setup and start - firstly, you need to connect to the machine of the server you want to profile. Then, you need to remember the PID of the server, or identify it in a list of running VM display names (not easy when multiple servers are running!) - then allow the profiler to run for a bit, before navigating to a temporary web server hosted by the application.
+* It is not accessible to some users, because in order to use it, you need direct SSH (or equivalent) access to the server (not possible on shared hosts).
+* It can be somewhat clunky to set up and start (typical steps: SSH into the server machine, open ports / disable firewall rules, start the process, identify the target VM, allow the profiler to run for a bit, then open a web browser and navigate to the temporary web page hosted by the application - not ideal!).
 * It's not easy to share profiling data with other developers or admins.
-* You need to have the Java Development Kit installed on your machine.
+* The Java Development Kit must be installed on the target machine.
 
 I've attempted to address these flaws in spark.
 
-* Profiling is managed entirely using in-game or console commands. You don't need to have direct access to the server machine - just install the plugin as you would normally.
+* Profiling is managed entirely using in-game or console commands.
+* You don't need to have direct access to the server machine - just install the plugin as you would normally.
 * Data is uploaded to a "pastebin"-esque site to be viewed - a temporary web server is not needed, and you can easily share your analysis with others!
 * It is not necessary to install any special Java agents or provide a path to the Java Development Kit.
 
@@ -46,7 +62,7 @@ Other benefits of spark compared with other profilers:
 
 * This works both for partially deobfuscated Bukkit mappings and for Sponge/Forge (Searge) mappings.
 * No specialist software is required to view the output, just a web browser.
 
-### How does it work?
+### spark vs "Real Profilers"
 
 The spark (WarmRoast) profiler operates using a technique known as [sampling](https://en.wikipedia.org/wiki/Profiling_(computer_programming)#Statistical_profilers). A sampling profiler works by probing the target program's call stack at regular intervals in order to determine how frequently certain actions are being performed.
 
 In practice, sampling profilers can often provide a more accurate picture of the target program's execution than other approaches, as they are not as intrusive to the target program, and thus don't have as many side effects. Sampling profilers are typically less numerically accurate and specific than other profiling methods (e.g. instrumentation), but allow the target program to run at near full speed.
 
@@ -59,7 +75,7 @@ Aikar's [timings](https://github.com/aikar/timings) system (built into Spigot an
 
 timings will generally be slightly more accurate than spark, but it is (arguably?!) less useful, as each area of analysis has to be manually defined.
 
-For example, timings might identify that a certain listener in plugin x is taking up a lot of CPU time processing the PlayerMoveEvent, but it won't tell you which part of the processing is slow. spark/WarmRoast on the other hand *will* show this information, right down to the name of the method call causing the bad performance.
+For example, timings might identify that a certain listener in plugin x is taking up a lot of CPU time processing the PlayerMoveEvent, but it won't tell you which part of the processing is slow. spark/WarmRoast on the other hand *will* show this information, right down to the name of the method call causing the issue.
 
 ## Installation
@@ -67,10 +83,12 @@ To install, add the **spark.jar** file to your servers plugins/mods directory, a
 
 ## Commands
 
-All commands require the `spark.profiler` permission.
+All commands require the `spark` permission.
+
+Note that `/sparkb`, `/sparkv`, and `/sparkc` must be used instead of `/spark` on BungeeCord, Velocity and Forge Client installs respectively.
 ___
-#### `/profiler start`
+#### `/spark start`
 Starts a new profiling operation.
 
 **Arguments**
@@ -90,25 +108,29 @@
 * `--only-ticks-over <tick length millis>`
   * Specifies that entries should only be included if they were part of a tick that took longer than the specified duration to execute.
 ___
-#### `/profiler info`
+#### `/spark info`
 Prints information about the active profiler, if present.
 ___
-#### `/profiler stop`
-Ends the current profiling operation, uploads the resultant data, and returns a link to view the call graph.
+#### `/spark stop`
+Ends the current profiling operation, uploads the resultant data, and returns a link to the viewer.
 ___
-#### `/profiler cancel`
+#### `/spark cancel`
 Cancels the current profiling operation, and discards any recorded data without uploading it.
 ___
-#### `/profiler monitoring`
+#### `/spark monitoring`
 Starts/stops the tick monitoring system.
 
 **Arguments**
 * `--threshold <percentage increase>`
   * Specifies the report threshold, measured as a percentage increase from the average tick duration.
+___
+#### `/spark heap`
+Creates a new memory (heap) dump, uploads the resultant data, and returns a link to the viewer.
+
 ## License
 
 spark is a fork of [WarmRoast](https://github.com/sk89q/WarmRoast), which is [licensed under the GNU General Public License](https://github.com/sk89q/WarmRoast/blob/3fe5e5517b1c529d95cf9f43fd8420c66db0092a/src/main/java/com/sk89q/warmroast/WarmRoast.java#L1-L17).
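To make the sampling approach described in the README concrete: below is a toy statistical sampler in the same spirit, written against plain JDK APIs. It probes every thread's stack at a fixed interval and counts how often each method sits on top; the methods sampled most often are, statistically, where the CPU time goes. This is a sketch for illustration only - spark's actual sampler builds full call graphs, groups threads, and can restrict samples to slow ticks.

import java.util.HashMap;
import java.util.Map;

public class ToySampler {
    public static void main(String[] args) throws InterruptedException {
        Map<String, Integer> hits = new HashMap<>();

        for (int i = 0; i < 500; i++) { // ~5 seconds of sampling at a 10ms interval
            for (StackTraceElement[] stack : Thread.getAllStackTraces().values()) {
                if (stack.length == 0) {
                    continue;
                }
                // count the method currently executing at the top of the stack
                String top = stack[0].getClassName() + "." + stack[0].getMethodName();
                hits.merge(top, 1, Integer::sum);
            }
            Thread.sleep(10);
        }

        // the most frequently sampled methods approximate the biggest CPU consumers
        hits.entrySet().stream()
                .sorted(Map.Entry.<String, Integer>comparingByValue().reversed())
                .limit(5)
                .forEach(e -> System.out.println(e.getValue() + "  " + e.getKey()));
    }
}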
diff --git a/build.gradle b/build.gradle
index 2d9f2bc..82ff57a 100644
--- a/build.gradle
+++ b/build.gradle
@@ -9,6 +9,7 @@ subprojects {
 
     ext {
         pluginVersion = '1.1.0'
+        pluginDesc = 'Spark is a performance profiling plugin based on sk89q\'s WarmRoast profiler'
     }
 
     sourceCompatibility = 1.8
diff --git a/spark-bukkit/build.gradle b/spark-bukkit/build.gradle
index e2f4da0..7fd90d6 100644
--- a/spark-bukkit/build.gradle
+++ b/spark-bukkit/build.gradle
@@ -6,6 +6,7 @@ dependencies {
 processResources {
     from(sourceSets.main.resources.srcDirs) {
         expand 'pluginVersion': project.pluginVersion
+        expand 'pluginDesc': project.pluginDesc
         include 'plugin.yml'
     }
 }
\ No newline at end of file
diff --git a/spark-bukkit/src/main/java/me/lucko/spark/bukkit/SparkBukkitPlugin.java b/spark-bukkit/src/main/java/me/lucko/spark/bukkit/SparkBukkitPlugin.java
index 122ebf1..aebf9a7 100644
--- a/spark-bukkit/src/main/java/me/lucko/spark/bukkit/SparkBukkitPlugin.java
+++ b/spark-bukkit/src/main/java/me/lucko/spark/bukkit/SparkBukkitPlugin.java
@@ -30,6 +30,9 @@ import org.bukkit.command.CommandSender;
 import org.bukkit.entity.Player;
 import org.bukkit.plugin.java.JavaPlugin;
 
+import java.util.Collections;
+import java.util.List;
+
 public class SparkBukkitPlugin extends JavaPlugin {
 
     private final SparkPlatform<CommandSender> sparkPlatform = new SparkPlatform<CommandSender>() {
@@ -41,7 +44,7 @@ public class SparkBukkitPlugin extends JavaPlugin {
         private void broadcast(String msg) {
             getServer().getConsoleSender().sendMessage(msg);
             for (Player player : getServer().getOnlinePlayers()) {
-                if (player.hasPermission("spark.profiler")) {
+                if (player.hasPermission("spark")) {
                     player.sendMessage(msg);
                 }
             }
@@ -92,7 +95,7 @@ public class SparkBukkitPlugin extends JavaPlugin {
 
     @Override
     public boolean onCommand(CommandSender sender, Command command, String label, String[] args) {
-        if (!sender.hasPermission("spark.profiler")) {
+        if (!sender.hasPermission("spark")) {
             sender.sendMessage(ChatColor.RED + "You do not have permission to use this command.");
             return true;
         }
@@ -100,4 +103,12 @@ public class SparkBukkitPlugin extends JavaPlugin {
         this.sparkPlatform.executeCommand(sender, args);
         return true;
     }
+
+    @Override
+    public List<String> onTabComplete(CommandSender sender, Command command, String alias, String[] args) {
+        if (!sender.hasPermission("spark")) {
+            return Collections.emptyList();
+        }
+        return this.sparkPlatform.tabCompleteCommand(sender, args);
+    }
 }
diff --git a/spark-bukkit/src/main/resources/plugin.yml b/spark-bukkit/src/main/resources/plugin.yml
index 94c61a3..922f8c9 100644
--- a/spark-bukkit/src/main/resources/plugin.yml
+++ b/spark-bukkit/src/main/resources/plugin.yml
@@ -1,10 +1,9 @@
 name: spark
 version: ${pluginVersion}
-description: Spark is a CPU profiling plugin based on sk89q's WarmRoast profiler
+description: ${pluginDesc}
 authors: [Luck, sk89q]
 main: me.lucko.spark.bukkit.SparkBukkitPlugin
 commands:
   spark:
-    description: Main plugin command
-    aliases: [profiler]
\ No newline at end of file
+    description: Main plugin command
\ No newline at end of file
diff --git a/spark-bungeecord/build.gradle b/spark-bungeecord/build.gradle
index bb87a73..7713359 100644
--- a/spark-bungeecord/build.gradle
+++ b/spark-bungeecord/build.gradle
@@ -6,6 +6,7 @@ dependencies {
 processResources {
     from(sourceSets.main.resources.srcDirs) {
         expand 'pluginVersion': project.pluginVersion
+        expand 'pluginDesc': project.pluginDesc
         include 'bungee.yml'
     }
 }
\ No newline at end of file
diff --git a/spark-bungeecord/src/main/java/me/lucko/spark/bungeecord/SparkBungeeCordPlugin.java b/spark-bungeecord/src/main/java/me/lucko/spark/bungeecord/SparkBungeeCordPlugin.java
index 4d54b42..da8ebf9 100644
--- a/spark-bungeecord/src/main/java/me/lucko/spark/bungeecord/SparkBungeeCordPlugin.java
+++ b/spark-bungeecord/src/main/java/me/lucko/spark/bungeecord/SparkBungeeCordPlugin.java
@@ -32,6 +32,9 @@ import net.md_5.bungee.api.chat.TextComponent;
 import net.md_5.bungee.api.connection.ProxiedPlayer;
 import net.md_5.bungee.api.plugin.Command;
 import net.md_5.bungee.api.plugin.Plugin;
+import net.md_5.bungee.api.plugin.TabExecutor;
+
+import java.util.Collections;
 
 public class SparkBungeeCordPlugin extends Plugin {
 
@@ -43,7 +46,7 @@ public class SparkBungeeCordPlugin extends Plugin {
         private void broadcast(BaseComponent... msg) {
             getProxy().getConsole().sendMessage(msg);
             for (ProxiedPlayer player : getProxy().getPlayers()) {
-                if (player.hasPermission("spark.profiler")) {
+                if (player.hasPermission("spark")) {
                     player.sendMessage(msg);
                 }
             }
@@ -56,7 +59,7 @@ public class SparkBungeeCordPlugin extends Plugin {
 
         @Override
         public String getLabel() {
-            return "sparkbungee";
+            return "sparkb";
         }
 
         @Override
@@ -95,18 +98,32 @@ public class SparkBungeeCordPlugin extends Plugin {
 
     @Override
     public void onEnable() {
-        getProxy().getPluginManager().registerCommand(this, new Command("sparkbungee", null, "gprofiler") {
-            @Override
-            public void execute(CommandSender sender, String[] args) {
-                if (!sender.hasPermission("spark.profiler")) {
-                    TextComponent msg = new TextComponent("You do not have permission to use this command.");
-                    msg.setColor(ChatColor.RED);
-                    sender.sendMessage(msg);
-                    return;
-                }
+        getProxy().getPluginManager().registerCommand(this, new SparkCommand());
+    }
+
+    private final class SparkCommand extends Command implements TabExecutor {
+        public SparkCommand() {
+            super("sparkb", null, "sparkbungee");
+        }
+
+        @Override
+        public void execute(CommandSender sender, String[] args) {
+            if (!sender.hasPermission("spark")) {
+                TextComponent msg = new TextComponent("You do not have permission to use this command.");
+                msg.setColor(ChatColor.RED);
+                sender.sendMessage(msg);
+                return;
+            }
+
+            SparkBungeeCordPlugin.this.sparkPlatform.executeCommand(sender, args);
+        }
 
-                SparkBungeeCordPlugin.this.sparkPlatform.executeCommand(sender, args);
+        @Override
+        public Iterable<String> onTabComplete(CommandSender sender, String[] args) {
+            if (!sender.hasPermission("spark")) {
+                return Collections.emptyList();
             }
-        });
+            return SparkBungeeCordPlugin.this.sparkPlatform.tabCompleteCommand(sender, args);
+        }
     }
 }
diff --git a/spark-bungeecord/src/main/resources/bungee.yml b/spark-bungeecord/src/main/resources/bungee.yml
index fa65fbc..2194180 100644
--- a/spark-bungeecord/src/main/resources/bungee.yml
+++ b/spark-bungeecord/src/main/resources/bungee.yml
@@ -1,5 +1,5 @@
 name: spark
 version: ${pluginVersion}
-description: Spark is a CPU profiling plugin based on sk89q's WarmRoast profiler
+description: ${pluginDesc}
 author: Luck, sk89q
 main: me.lucko.spark.bungeecord.SparkBungeeCordPlugin
diff --git a/spark-common/src/main/java/me/lucko/spark/common/SparkPlatform.java b/spark-common/src/main/java/me/lucko/spark/common/SparkPlatform.java
index 57c205f..1de0ec9 100644
--- a/spark-common/src/main/java/me/lucko/spark/common/SparkPlatform.java
+++ b/spark-common/src/main/java/me/lucko/spark/common/SparkPlatform.java
@@ -27,10 +27,16 @@ import me.lucko.spark.common.command.Command;
 import me.lucko.spark.common.command.modules.HeapModule;
 import me.lucko.spark.common.command.modules.MonitoringModule;
 import me.lucko.spark.common.command.modules.SamplerModule;
+import me.lucko.spark.common.command.tabcomplete.CompletionSupplier;
+import me.lucko.spark.common.command.tabcomplete.TabCompleter;
 import me.lucko.spark.sampler.ThreadDumper;
 import me.lucko.spark.sampler.TickCounter;
 
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
 import java.util.List;
+import java.util.stream.Collectors;
 
 /**
  * Abstract command handling class used by all platforms.
@@ -55,7 +61,6 @@ public abstract class SparkPlatform<S> {
     private final List<Command<S>> commands = prepareCommands();
 
     // abstract methods implemented by each platform
-
     public abstract String getVersion();
     public abstract String getLabel();
     public abstract void sendMessage(S sender, String message);
@@ -75,7 +80,7 @@ public abstract class SparkPlatform<S> {
 
     public void executeCommand(S sender, String[] args) {
         if (args.length == 0) {
-            sendInfo(sender);
+            sendUsage(sender);
             return;
         }
 
@@ -93,23 +98,41 @@ public abstract class SparkPlatform<S> {
             }
         }
 
-        sendInfo(sender);
+        sendUsage(sender);
     }
 
-    private void sendInfo(S sender) {
-        // todo automagically generate this
+    public List<String> tabCompleteCommand(S sender, String[] args) {
+        List<String> arguments = new ArrayList<>(Arrays.asList(args));
+
+        if (args.length <= 1) {
+            List<String> mainCommands = this.commands.stream().map(c -> c.aliases().get(0)).collect(Collectors.toList());
+            return TabCompleter.create()
+                    .at(0, CompletionSupplier.startsWith(mainCommands))
+                    .complete(arguments);
+        }
+
+        String alias = arguments.remove(0);
+        for (Command<S> command : this.commands) {
+            if (command.aliases().contains(alias)) {
+                return command.tabCompleter().completions(this, sender, arguments);
+            }
+        }
+
+        return Collections.emptyList();
+    }
+
+    private void sendUsage(S sender) {
         sendPrefixedMessage(sender, "&fspark &7v" + getVersion());
-        sendMessage(sender, "&b&l> &7/spark start");
-        sendMessage(sender, "       &8[&7--timeout&8 <timeout seconds>]");
-        sendMessage(sender, "       &8[&7--thread&8 <thread name>]");
-        sendMessage(sender, "       &8[&7--not-combined]");
-        sendMessage(sender, "       &8[&7--interval&8 <interval millis>]");
-        sendMessage(sender, "       &8[&7--only-ticks-over&8 <tick length millis>]");
-        sendMessage(sender, "&b&l> &7/spark info");
-        sendMessage(sender, "&b&l> &7/spark stop");
-        sendMessage(sender, "&b&l> &7/spark cancel");
-        sendMessage(sender, "&b&l> &7/spark monitoring");
-        sendMessage(sender, "       &8[&7--threshold&8 <percentage increase>]");
+        for (Command<S> command : this.commands) {
+            sendMessage(sender, "&b&l> &7/" + getLabel() + " " + command.aliases().get(0));
+            for (Command.ArgumentInfo arg : command.arguments()) {
+                if (arg.requiresParameter()) {
+                    sendMessage(sender, "       &8[&7--" + arg.argumentName() + "&8 <" + arg.parameterDescription() + ">]");
+                } else {
+                    sendMessage(sender, "       &8[&7--" + arg.argumentName() + "]");
+                }
+            }
+        }
     }
 }
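The tabCompleteCommand method added above resolves completions in two stages: the first argument is completed against the primary alias of each registered command, and everything after it is delegated to the matched command's own TabCompleter. A minimal sketch of the first stage, using the TabCompleter/CompletionSupplier utilities introduced later in this commit (the alias list is taken from the command modules in this diff):

import me.lucko.spark.common.command.tabcomplete.CompletionSupplier;
import me.lucko.spark.common.command.tabcomplete.TabCompleter;

import java.util.Arrays;
import java.util.List;

public class TabCompleteStageOneDemo {
    public static void main(String[] args) {
        // primary aliases of the commands registered by the modules in this commit
        List<String> mainCommands = Arrays.asList("start", "info", "stop", "cancel", "monitoring", "heap");

        // a player has typed "/spark st<tab>" - only argument 0 is completed,
        // case-insensitively, against the primary aliases
        List<String> completions = TabCompleter.create()
                .at(0, CompletionSupplier.startsWith(mainCommands))
                .complete(Arrays.asList("st"));

        System.out.println(completions); // [start, stop]
    }
}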
diff --git a/spark-common/src/main/java/me/lucko/spark/common/command/Command.java b/spark-common/src/main/java/me/lucko/spark/common/command/Command.java
index 70dc7e8..a28320b 100644
--- a/spark-common/src/main/java/me/lucko/spark/common/command/Command.java
+++ b/spark-common/src/main/java/me/lucko/spark/common/command/Command.java
@@ -20,14 +20,13 @@
 
 package me.lucko.spark.common.command;
 
-import com.google.common.collect.ImmutableSet;
+import com.google.common.collect.ImmutableList;
 
 import me.lucko.spark.common.SparkPlatform;
 
 import java.util.Collections;
 import java.util.List;
 import java.util.Objects;
-import java.util.Set;
 
 public class Command<S> {
 
@@ -35,20 +34,26 @@ public class Command<S> {
         return new Builder<>();
     }
 
-    private final Set<String> aliases;
+    private final List<String> aliases;
+    private final List<ArgumentInfo> arguments;
     private final Executor<S> executor;
     private final TabCompleter<S> tabCompleter;
 
-    private Command(Set<String> aliases, Executor<S> executor, TabCompleter<S> tabCompleter) {
+    private Command(List<String> aliases, List<ArgumentInfo> arguments, Executor<S> executor, TabCompleter<S> tabCompleter) {
         this.aliases = aliases;
+        this.arguments = arguments;
         this.executor = executor;
         this.tabCompleter = tabCompleter;
     }
 
-    public Set<String> aliases() {
+    public List<String> aliases() {
         return this.aliases;
     }
 
+    public List<ArgumentInfo> arguments() {
+        return this.arguments;
+    }
+
     public Executor<S> executor() {
         return this.executor;
     }
@@ -58,7 +63,8 @@ public class Command<S> {
     }
 
     public static final class Builder<S> {
-        private ImmutableSet.Builder<String> aliases = ImmutableSet.builder();
+        private ImmutableList.Builder<String> aliases = ImmutableList.builder();
+        private ImmutableList.Builder<ArgumentInfo> arguments = ImmutableList.builder();
         private Executor<S> executor = null;
         private TabCompleter<S> tabCompleter = null;
 
@@ -71,6 +77,11 @@ public class Command<S> {
             return this;
         }
 
+        public Builder<S> argumentUsage(String argumentName, String parameterDescription) {
+            this.arguments.add(new ArgumentInfo(argumentName, parameterDescription));
+            return this;
+        }
+
         public Builder<S> executor(Executor<S> executor) {
             this.executor = Objects.requireNonNull(executor, "executor");
             return this;
@@ -82,7 +93,7 @@ public class Command<S> {
         }
 
         public Command<S> build() {
-            Set<String> aliases = this.aliases.build();
+            List<String> aliases = this.aliases.build();
             if (aliases.isEmpty()) {
                 throw new IllegalStateException("No aliases defined");
             }
@@ -92,7 +103,7 @@ public class Command<S> {
             if (this.tabCompleter == null) {
                 this.tabCompleter = TabCompleter.empty();
             }
-            return new Command<>(aliases, this.executor, this.tabCompleter);
+            return new Command<>(aliases, this.arguments.build(), this.executor, this.tabCompleter);
         }
     }
 
@@ -110,4 +121,26 @@ public class Command<S> {
         List<String> completions(SparkPlatform<S> platform, S sender, List<String> arguments);
     }
 
+    public static final class ArgumentInfo {
+        private final String argumentName;
+        private final String parameterDescription;
+
+        public ArgumentInfo(String argumentName, String parameterDescription) {
+            this.argumentName = argumentName;
+            this.parameterDescription = parameterDescription;
+        }
+
+        public String argumentName() {
+            return this.argumentName;
+        }
+
+        public String parameterDescription() {
+            return this.parameterDescription;
+        }
+
+        public boolean requiresParameter() {
+            return this.parameterDescription != null;
+        }
+    }
+
 }
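To show what the new ArgumentInfo metadata carries, here is a hypothetical command (the "example" alias and its flags are made up for illustration) built the same way the modules below build theirs; sendUsage iterates arguments() to render one usage line per flag:

import me.lucko.spark.common.command.Command;

public class ArgumentInfoDemo {
    public static void main(String[] args) {
        // hypothetical command, for illustration only
        Command<Object> command = Command.<Object>builder()
                .aliases("example")
                .argumentUsage("timeout", "timeout seconds") // flag that takes a parameter
                .argumentUsage("not-combined", null)         // boolean flag, no parameter
                .executor((platform, sender, arguments) -> {})
                .build();

        for (Command.ArgumentInfo arg : command.arguments()) {
            // requiresParameter() is simply parameterDescription != null
            if (arg.requiresParameter()) {
                System.out.println("[--" + arg.argumentName() + " <" + arg.parameterDescription() + ">]");
            } else {
                System.out.println("[--" + arg.argumentName() + "]");
            }
        }
        // prints:
        //   [--timeout <timeout seconds>]
        //   [--not-combined]
    }
}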
diff --git a/spark-common/src/main/java/me/lucko/spark/common/command/modules/HeapModule.java b/spark-common/src/main/java/me/lucko/spark/common/command/modules/HeapModule.java
index e586971..8752443 100644
--- a/spark-common/src/main/java/me/lucko/spark/common/command/modules/HeapModule.java
+++ b/spark-common/src/main/java/me/lucko/spark/common/command/modules/HeapModule.java
@@ -59,9 +59,6 @@ public class HeapModule<S> implements CommandModule<S> {
                     }
                 });
             })
-                .tabCompleter((platform, sender, arguments) -> {
-                    return null;
-                })
                 .build()
         );
     }
diff --git a/spark-common/src/main/java/me/lucko/spark/common/command/modules/MonitoringModule.java b/spark-common/src/main/java/me/lucko/spark/common/command/modules/MonitoringModule.java
index eafc567..a6a227f 100644
--- a/spark-common/src/main/java/me/lucko/spark/common/command/modules/MonitoringModule.java
+++ b/spark-common/src/main/java/me/lucko/spark/common/command/modules/MonitoringModule.java
@@ -23,9 +23,14 @@ package me.lucko.spark.common.command.modules;
 
 import me.lucko.spark.common.SparkPlatform;
 import me.lucko.spark.common.command.Command;
 import me.lucko.spark.common.command.CommandModule;
+import me.lucko.spark.common.command.tabcomplete.CompletionSupplier;
+import me.lucko.spark.common.command.tabcomplete.TabCompleter;
 import me.lucko.spark.monitor.TickMonitor;
 import me.lucko.spark.sampler.TickCounter;
 
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
 import java.util.function.Consumer;
 
 public class MonitoringModule<S> implements CommandModule<S> {
@@ -37,6 +42,7 @@ public void registerCommands(Consumer<Command<S>> consumer) {
         consumer.accept(Command.<S>builder()
                 .aliases("monitoring")
+                .argumentUsage("threshold", "percentage increase")
                 .executor((platform, sender, arguments) -> {
                     if (this.activeTickMonitor == null) {
 
@@ -58,7 +64,12 @@
                     }
                 })
                 .tabCompleter((platform, sender, arguments) -> {
-                    return null;
+                    List<String> opts = new ArrayList<>(Collections.singletonList("--threshold"));
+                    opts.removeAll(arguments);
+
+                    return TabCompleter.create()
+                            .from(0, CompletionSupplier.startsWith(opts))
+                            .complete(arguments);
                 })
                 .build()
         );
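One detail of the completer above worth noting: flags the user has already typed are removed from the option list before matching, so a flag is never suggested twice. A trimmed-down sketch of that filtering step (the typed arguments are illustrative):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class FlagFilterDemo {
    public static void main(String[] args) {
        // the flags this command understands...
        List<String> opts = new ArrayList<>(Collections.singletonList("--threshold"));

        // ...minus whatever already appears among the typed arguments,
        // e.g. "/spark monitoring --threshold 50 --<tab>"
        opts.removeAll(Arrays.asList("--threshold", "50", "--"));

        System.out.println(opts); // [] - nothing left to suggest
    }
}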
diff --git a/spark-common/src/main/java/me/lucko/spark/common/command/modules/SamplerModule.java b/spark-common/src/main/java/me/lucko/spark/common/command/modules/SamplerModule.java
index 853aa5d..2b814e3 100644
--- a/spark-common/src/main/java/me/lucko/spark/common/command/modules/SamplerModule.java
+++ b/spark-common/src/main/java/me/lucko/spark/common/command/modules/SamplerModule.java
@@ -23,6 +23,8 @@ package me.lucko.spark.common.command.modules;
 
 import me.lucko.spark.common.SparkPlatform;
 import me.lucko.spark.common.command.Command;
 import me.lucko.spark.common.command.CommandModule;
+import me.lucko.spark.common.command.tabcomplete.CompletionSupplier;
+import me.lucko.spark.common.command.tabcomplete.TabCompleter;
 import me.lucko.spark.common.http.Bytebin;
 import me.lucko.spark.sampler.Sampler;
 import me.lucko.spark.sampler.SamplerBuilder;
@@ -31,6 +33,9 @@ import me.lucko.spark.sampler.ThreadGrouper;
 import me.lucko.spark.sampler.TickCounter;
 
 import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
 import java.util.Set;
 import java.util.concurrent.CompletableFuture;
 import java.util.concurrent.TimeUnit;
@@ -47,6 +52,11 @@ public class SamplerModule<S> implements CommandModule<S> {
     public void registerCommands(Consumer<Command<S>> consumer) {
         consumer.accept(Command.<S>builder()
                 .aliases("start")
+                .argumentUsage("timeout", "timeout seconds")
+                .argumentUsage("thread", "thread name")
+                .argumentUsage("not-combined", null)
+                .argumentUsage("interval", "interval millis")
+                .argumentUsage("only-ticks-over", "tick length millis")
                 .executor((platform, sender, arguments) -> {
                     int timeoutSeconds = arguments.intFlag("timeout");
                     if (timeoutSeconds != -1 && timeoutSeconds <= 10) {
@@ -149,7 +159,14 @@ public class SamplerModule<S> implements CommandModule<S> {
                     }
                 })
                 .tabCompleter((platform, sender, arguments) -> {
-                    return null;
+                    List<String> opts = new ArrayList<>(Arrays.asList("--timeout", "--interval",
+                            "--not-combined", "--only-ticks-over"));
+                    opts.removeAll(arguments);
+                    opts.add("--thread"); // allowed multiple times
+
+                    return TabCompleter.create()
+                            .from(0, CompletionSupplier.startsWith(opts))
+                            .complete(arguments);
                 })
                 .build()
         );
@@ -174,9 +191,6 @@ public class SamplerModule<S> implements CommandModule<S> {
                     }
                 })
-                .tabCompleter((platform, sender, arguments) -> {
-                    return null;
-                })
                 .build()
         );
 
@@ -194,9 +208,6 @@ public class SamplerModule<S> implements CommandModule<S> {
                     }
                 })
-                .tabCompleter((platform, sender, arguments) -> {
-                    return null;
-                })
                 .build()
         );
 
@@ -213,9 +224,6 @@ public class SamplerModule<S> implements CommandModule<S> {
                     }
                 })
-                .tabCompleter((platform, sender, arguments) -> {
-                    return null;
-                })
                 .build()
         );
     }
diff --git a/spark-common/src/main/java/me/lucko/spark/common/command/tabcomplete/CompletionSupplier.java b/spark-common/src/main/java/me/lucko/spark/common/command/tabcomplete/CompletionSupplier.java
new file mode 100644
index 0000000..f1a6d10
--- /dev/null
+++ b/spark-common/src/main/java/me/lucko/spark/common/command/tabcomplete/CompletionSupplier.java
@@ -0,0 +1,56 @@
+/*
+ * This file is part of LuckPerms, licensed under the MIT License.
+ *
+ * Copyright (c) lucko (Luck) <luck@lucko.me>
+ * Copyright (c) contributors
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a copy
+ * of this software and associated documentation files (the "Software"), to deal
+ * in the Software without restriction, including without limitation the rights
+ * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ * copies of the Software, and to permit persons to whom the Software is
+ * furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in all
+ * copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ * SOFTWARE.
+ */
+
+package me.lucko.spark.common.command.tabcomplete;
+
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.function.Predicate;
+import java.util.stream.Collectors;
+
+public interface CompletionSupplier {
+
+    CompletionSupplier EMPTY = partial -> Collections.emptyList();
+
+    static CompletionSupplier startsWith(Collection<String> strings) {
+        if (strings.isEmpty()) {
+            return EMPTY;
+        }
+        return partial -> strings.stream().filter(startsWithIgnoreCasePredicate(partial)).collect(Collectors.toList());
+    }
+
+    static Predicate<String> startsWithIgnoreCasePredicate(String prefix) {
+        return string -> {
+            if (string.length() < prefix.length()) {
+                return false;
+            }
+            return string.regionMatches(true, 0, prefix, 0, prefix.length());
+        };
+    }
+
+    List<String> supplyCompletions(String partial);
+
+}
\ No newline at end of file
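The prefix matching in CompletionSupplier.startsWith is case-insensitive (it uses String#regionMatches with ignoreCase = true), so completions are found however the partial input is capitalised. For example:

import me.lucko.spark.common.command.tabcomplete.CompletionSupplier;

import java.util.Arrays;

public class CompletionSupplierDemo {
    public static void main(String[] args) {
        CompletionSupplier supplier = CompletionSupplier.startsWith(Arrays.asList("--timeout", "--thread"));

        System.out.println(supplier.supplyCompletions("--T"));  // [--timeout, --thread]
        System.out.println(supplier.supplyCompletions("--ti")); // [--timeout]
        System.out.println(supplier.supplyCompletions("-x"));   // []
    }
}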
diff --git a/spark-common/src/main/java/me/lucko/spark/common/command/tabcomplete/TabCompleter.java b/spark-common/src/main/java/me/lucko/spark/common/command/tabcomplete/TabCompleter.java
new file mode 100644
index 0000000..f8774b2
--- /dev/null
+++ b/spark-common/src/main/java/me/lucko/spark/common/command/tabcomplete/TabCompleter.java
@@ -0,0 +1,100 @@
+/*
+ * This file is part of LuckPerms, licensed under the MIT License.
+ *
+ * Copyright (c) lucko (Luck) <luck@lucko.me>
+ * Copyright (c) contributors
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a copy
+ * of this software and associated documentation files (the "Software"), to deal
+ * in the Software without restriction, including without limitation the rights
+ * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ * copies of the Software, and to permit persons to whom the Software is
+ * furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in all
+ * copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ * SOFTWARE.
+ */
+
+package me.lucko.spark.common.command.tabcomplete;
+
+import com.google.common.base.Preconditions;
+
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * Utility for computing tab completion results
+ */
+public class TabCompleter {
+
+    public static TabCompleter create() {
+        return new TabCompleter();
+    }
+
+    private final Map<Integer, CompletionSupplier> suppliers = new HashMap<>();
+    private int from = Integer.MAX_VALUE;
+
+    private TabCompleter() {
+
+    }
+
+    /**
+     * Marks that the given completion supplier should be used to compute tab
+     * completions at the given index.
+     *
+     * @param position the position
+     * @param supplier the supplier
+     * @return this
+     */
+    public TabCompleter at(int position, CompletionSupplier supplier) {
+        Preconditions.checkState(position < this.from);
+        this.suppliers.put(position, supplier);
+        return this;
+    }
+
+    /**
+     * Marks that the given completion supplier should be used to compute tab
+     * completions at the given index and at all subsequent indexes infinitely.
+     *
+     * @param position the position
+     * @param supplier the supplier
+     * @return this
+     */
+    public TabCompleter from(int position, CompletionSupplier supplier) {
+        Preconditions.checkState(this.from == Integer.MAX_VALUE);
+        this.suppliers.put(position, supplier);
+        this.from = position;
+        return this;
+    }
+
+    public List<String> complete(List<String> args) {
+        int lastIndex = 0;
+        String partial;
+
+        // nothing entered yet
+        if (args.isEmpty() || (partial = args.get((lastIndex = args.size() - 1))).trim().isEmpty()) {
+            return getCompletions(lastIndex, "");
+        }
+
+        // started typing something
+        return getCompletions(lastIndex, partial);
+    }
+
+    private List<String> getCompletions(int position, String partial) {
+        if (position >= this.from) {
+            return this.suppliers.get(this.from).supplyCompletions(partial);
+        }
+
+        return this.suppliers.getOrDefault(position, CompletionSupplier.EMPTY).supplyCompletions(partial);
+    }
+
+}
\ No newline at end of file
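Putting the two utilities together: at(position, ...) binds a supplier to a single argument index, while from(position, ...) binds a supplier to that index and every one after it - the latter is what the command modules use so flags can be suggested at any position. A small usage sketch with illustrative values:

import me.lucko.spark.common.command.tabcomplete.CompletionSupplier;
import me.lucko.spark.common.command.tabcomplete.TabCompleter;

import java.util.Arrays;

public class TabCompleterDemo {
    public static void main(String[] args) {
        TabCompleter completer = TabCompleter.create()
                .at(0, CompletionSupplier.startsWith(Arrays.asList("alpha", "beta")))
                .from(1, CompletionSupplier.startsWith(Arrays.asList("--flag")));

        // index 0 uses the at(...) supplier
        System.out.println(completer.complete(Arrays.asList("al")));                  // [alpha]
        // index 1 and beyond fall through to the from(...) supplier
        System.out.println(completer.complete(Arrays.asList("alpha", "-")));          // [--flag]
        System.out.println(completer.complete(Arrays.asList("alpha", "--flag", ""))); // [--flag]
    }
}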
diff --git a/spark-forge/build.gradle b/spark-forge/build.gradle
index 68322c3..be1982d 100644
--- a/spark-forge/build.gradle
+++ b/spark-forge/build.gradle
@@ -24,6 +24,7 @@ minecraft {
 processResources {
     from(sourceSets.main.resources.srcDirs) {
         expand 'pluginVersion': project.pluginVersion
+        expand 'pluginDesc': project.pluginDesc
         include 'mcmod.info'
     }
 }
diff --git a/spark-forge/src/main/java/me/lucko/spark/forge/ForgeClientSparkPlatform.java b/spark-forge/src/main/java/me/lucko/spark/forge/ForgeClientSparkPlatform.java
index d073497..b56dd70 100644
--- a/spark-forge/src/main/java/me/lucko/spark/forge/ForgeClientSparkPlatform.java
+++ b/spark-forge/src/main/java/me/lucko/spark/forge/ForgeClientSparkPlatform.java
@@ -50,17 +50,17 @@ public class ForgeClientSparkPlatform extends ForgeSparkPlatform {
 
     @Override
     public String getLabel() {
-        return "sparkclient";
+        return "sparkc";
     }
 
     @Override
     public String getName() {
-        return "sparkclient";
+        return "sparkc";
     }
 
     @Override
     public List<String> getAliases() {
-        return Collections.singletonList("cprofiler");
+        return Collections.singletonList("sparkclient");
    }
 
     @Override
diff --git a/spark-forge/src/main/java/me/lucko/spark/forge/ForgeServerSparkPlatform.java b/spark-forge/src/main/java/me/lucko/spark/forge/ForgeServerSparkPlatform.java
index 6b64b95..d667234 100644
--- a/spark-forge/src/main/java/me/lucko/spark/forge/ForgeServerSparkPlatform.java
+++ b/spark-forge/src/main/java/me/lucko/spark/forge/ForgeServerSparkPlatform.java
@@ -40,7 +40,7 @@ public class ForgeServerSparkPlatform extends ForgeSparkPlatform {
         List<EntityPlayerMP> players = FMLCommonHandler.instance().getMinecraftServerInstance().getPlayerList().getPlayers();
         for (EntityPlayerMP player : players) {
-            if (player.canUseCommand(4, "spark.profiler")) {
+            if (player.canUseCommand(4, "spark")) {
                 player.sendMessage(msg);
             }
         }
@@ -63,11 +63,11 @@ public class ForgeServerSparkPlatform extends ForgeSparkPlatform {
 
     @Override
     public List<String> getAliases() {
-        return Collections.singletonList("profiler");
+        return Collections.emptyList();
     }
 
     @Override
     public boolean checkPermission(MinecraftServer server, ICommandSender sender) {
-        return sender.canUseCommand(4, "spark.profiler");
+        return sender.canUseCommand(4, "spark");
     }
 }
diff --git a/spark-forge/src/main/java/me/lucko/spark/forge/ForgeSparkPlatform.java b/spark-forge/src/main/java/me/lucko/spark/forge/ForgeSparkPlatform.java
index 542c782..5b62f3f 100644
--- a/spark-forge/src/main/java/me/lucko/spark/forge/ForgeSparkPlatform.java
+++ b/spark-forge/src/main/java/me/lucko/spark/forge/ForgeSparkPlatform.java
@@ -122,7 +122,10 @@ public abstract class ForgeSparkPlatform extends SparkPlatform<ICommandSender> i
 
     @Override
     public List<String> getTabCompletions(MinecraftServer server, ICommandSender sender, String[] args, @Nullable BlockPos blockPos) {
-        return Collections.emptyList();
+        if (!checkPermission(server, sender)) {
+            return Collections.emptyList();
+        }
+        return tabCompleteCommand(sender, args);
     }
 
     @Override
diff --git a/spark-forge/src/main/resources/mcmod.info b/spark-forge/src/main/resources/mcmod.info
index 88b1ccb..861646f 100644
--- a/spark-forge/src/main/resources/mcmod.info
+++ b/spark-forge/src/main/resources/mcmod.info
@@ -1,7 +1,7 @@
 [{
   "modid": "spark",
   "name": "spark",
-  "description": "Spark is a CPU profiling plugin based on sk89q's WarmRoast profiler",
+  "description": "${pluginDesc}",
   "version": "${pluginVersion}",
   "authors": ["Luck", "sk89q"]
 }]
\ No newline at end of file
diff --git a/spark-sponge/build.gradle b/spark-sponge/build.gradle
index c2f0efc..581c1ec 100644
--- a/spark-sponge/build.gradle
+++ b/spark-sponge/build.gradle
@@ -11,4 +11,5 @@ dependencies {
 blossom {
     replaceTokenIn('src/main/java/me/lucko/spark/sponge/SparkSpongePlugin.java')
     replaceToken '@version@', project.pluginVersion
+    replaceToken '@desc@', project.pluginDesc
 }
\ No newline at end of file
diff --git a/spark-sponge/src/main/java/me/lucko/spark/sponge/SparkSpongePlugin.java b/spark-sponge/src/main/java/me/lucko/spark/sponge/SparkSpongePlugin.java
index a9bc99f..23d01aa 100644
--- a/spark-sponge/src/main/java/me/lucko/spark/sponge/SparkSpongePlugin.java
+++ b/spark-sponge/src/main/java/me/lucko/spark/sponge/SparkSpongePlugin.java
@@ -56,7 +56,7 @@ import javax.annotation.Nullable;
         id = "spark",
         name = "spark",
         version = "@version@",
-        description = "Spark is a CPU profiling plugin based on sk89q's WarmRoast profiler",
+        description = "@desc@",
         authors = {"Luck", "sk89q"}
 )
 public class SparkSpongePlugin implements CommandCallable {
@@ -69,7 +69,7 @@ public class SparkSpongePlugin implements CommandCallable {
     private void broadcast(Text msg) {
         Sponge.getServer().getConsole().sendMessage(msg);
         for (Player player : Sponge.getServer().getOnlinePlayers()) {
-            if (player.hasPermission("spark.profiler")) {
+            if (player.hasPermission("spark")) {
                 player.sendMessage(msg);
             }
         }
@@ -134,7 +134,7 @@ public class SparkSpongePlugin implements CommandCallable {
 
     @Listener
     public void onServerStart(GameStartedServerEvent event) {
-        game.getCommandManager().register(this, this, "spark", "profiler");
+        game.getCommandManager().register(this, this, "spark");
     }
 
     @Override
@@ -155,7 +155,7 @@ public class SparkSpongePlugin implements CommandCallable {
 
     @Override
     public boolean testPermission(CommandSource source) {
-        return source.hasPermission("spark.profiler");
+        return source.hasPermission("spark");
     }
 
     @Override
@@ -165,11 +165,11 @@ public class SparkSpongePlugin implements CommandCallable {
 
     @Override
     public Optional<Text> getHelp(CommandSource source) {
-        return Optional.of(Text.of("Run '/profiler' to view usage."));
+        return Optional.of(Text.of("Run '/spark' to view usage."));
     }
 
     @Override
     public Text getUsage(CommandSource source) {
-        return Text.of("Run '/profiler' to view usage.");
+        return Text.of("Run '/spark' to view usage.");
     }
 }
diff --git a/spark-velocity/build.gradle b/spark-velocity/build.gradle
index aeeba60..b7f02e0 100644
--- a/spark-velocity/build.gradle
+++ b/spark-velocity/build.gradle
@@ -11,4 +11,5 @@ dependencies {
 blossom {
     replaceTokenIn('src/main/java/me/lucko/spark/velocity/SparkVelocityPlugin.java')
     replaceToken '@version@', project.pluginVersion
+    replaceToken '@desc@', project.pluginDesc
 }
\ No newline at end of file
diff --git a/spark-velocity/src/main/java/me/lucko/spark/velocity/SparkVelocityPlugin.java b/spark-velocity/src/main/java/me/lucko/spark/velocity/SparkVelocityPlugin.java
index 4cec138..8cc10e1 100644
--- a/spark-velocity/src/main/java/me/lucko/spark/velocity/SparkVelocityPlugin.java
+++ b/spark-velocity/src/main/java/me/lucko/spark/velocity/SparkVelocityPlugin.java
@@ -43,7 +43,7 @@ import net.kyori.text.serializer.ComponentSerializers;
     id = "spark",
     name = "spark",
     version = "@version@",
-    description = "Spark is a CPU profiling plugin based on sk89q's WarmRoast profiler",
+    description = "@desc@",
     authors = {"Luck", "sk89q"}
 )
 public class SparkVelocityPlugin {
@@ -59,7 +59,7 @@ public class SparkVelocityPlugin {
     private void broadcast(Component msg) {
         SparkVelocityPlugin.this.proxy.getConsoleCommandSource().sendMessage(msg);
         for (Player player : SparkVelocityPlugin.this.proxy.getAllPlayers()) {
-            if (player.hasPermission("spark.profiler")) {
+            if (player.hasPermission("spark")) {
                 player.sendMessage(msg);
             }
         }
@@ -72,7 +72,7 @@ public class SparkVelocityPlugin {
 
         @Override
         public String getLabel() {
-            return "sparkvelocity";
+            return "sparkv";
         }
 
         @Override
@@ -118,13 +118,13 @@ public class SparkVelocityPlugin {
     @Subscribe(order = PostOrder.FIRST)
     public void onEnable(ProxyInitializeEvent e) {
         this.proxy.getCommandManager().register((sender, args) -> {
-            if (!sender.hasPermission("spark.profiler")) {
+            if (!sender.hasPermission("spark")) {
                 TextComponent msg = TextComponent.builder("You do not have permission to use this command.").color(TextColor.RED).build();
                 sender.sendMessage(msg);
                 return;
             }
 
             SparkVelocityPlugin.this.sparkPlatform.executeCommand(sender, args);
-        }, "sparkvelocity", "vprofiler");
+        }, "sparkv", "sparkvelocity");
     }
 }