Scheduling Apache Spark Jobs
Scheduling in Ilum allows you to automate the execution of Apache Spark jobs on Kubernetes clusters at specified intervals using CRON expressions. This is essential for setting up reliable ETL pipelines, regular data analysis, or maintenance tasks that need to run without manual intervention.
You can use a JAR file containing the Spark examples from this link.
Step-by-Step Guide: Scheduling a Spark Job
- Navigate to Schedules: Access the Schedules section in your Ilum dashboard.
- Create New Schedule: Click the New Schedule + button to start setting up your automated job.
- Fill Out Schedule Details:
  - General tab:
    - Name: Enter ScheduledMiniReadWriteTest
    - Cluster: Select your target cluster
    - Class: Enter org.apache.spark.examples.MiniReadWriteTest
    - Language: Select Scala
  - Timing tab:
    - CRON Expression: Select the Custom tab.
    - Custom expression: Enter @daily

    This configuration will trigger the job to run once every day at midnight. You can adjust it to any valid CRON expression (e.g., 0 */12 * * * for every 12 hours).
  - Configuration tab:
    - Arguments: Enter /opt/spark/examples/src/main/resources/kv1.txt
  - Resources tab:
    - Jars: Upload the JAR file from the link above.
  - Memory tab:
    - Leave all settings at their default values for this example.
- Submit and Monitor:
  - Click Submit to create the schedule.
  - Your new schedule will appear in the list.
  - When the scheduled time arrives, a new job instance is launched automatically. You can view these instances in the Jobs section.
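The example class scheduled above (MiniReadWriteTest) is essentially an I/O health check: it reads the input file, writes a copy out, and reads the copy back. Below is a minimal plain-Python sketch of that round-trip idea, with no Spark involved; the real Spark example performs the write through cluster storage, so treat this only as an illustration of what the job verifies:

```python
import pathlib
import tempfile

def mini_read_write_test(src: str) -> bool:
    """Round-trip a file: read it, write a copy, read the copy back."""
    data = pathlib.Path(src).read_text()
    with tempfile.TemporaryDirectory() as scratch:
        copy = pathlib.Path(scratch) / "copy.txt"
        copy.write_text(data)                # write phase
        return copy.read_text() == data      # read-back must match exactly

# Example: round-trip a small tab-separated key-value file
# (hypothetical sample content, standing in for kv1.txt)
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("k1\tv1\nk2\tv2\n")
    sample = f.name

print(mini_read_write_test(sample))  # True on a healthy filesystem
```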
Schedule Configuration Reference
Below is a detailed breakdown of all available settings, organized by tab as they appear in the UI.
- General
- Timing
- Configuration
- Resources
- Memory
General

| Parameter | Description |
|---|---|
| Name | A unique identifier for the schedule. |
| Cluster | The target cluster where the scheduled jobs will be executed. |
| Class | The fully qualified class name of the application (e.g., org.apache.spark.examples.SparkPi) or the filename for Python scripts. |
| Language | The programming language used for the job (Scala or Python). |
| Description | An optional description explaining the purpose of this schedule. |
| Max Retries | The maximum number of times Ilum will attempt to restart the job if it fails. |
Timing

| Parameter | Description |
|---|---|
| Start Time | (Optional) The specific date and time when the schedule should become active. If left blank, it starts immediately. |
| End Time | (Optional) The specific date and time when the schedule should stop triggering new jobs. |
| CRON Expression | Defines the frequency of the job execution. You can use the visual builders (Minutes, Hourly, Daily, Weekly, Monthly) or select Custom to enter a standard Unix-style CRON expression (e.g., 0 12 * * * for noon daily). |
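To reason about when an expression such as 0 */12 * * * will fire, it helps to see how the five fields (minute, hour, day-of-month, month, day-of-week) are matched against a timestamp. The following is a simplified matcher sketch, not Ilum's actual parser: it handles *, */n, ranges, and comma lists, and it ANDs all five fields, whereas real cron ORs day-of-month with day-of-week when both are restricted:

```python
from datetime import datetime

def field_matches(field: str, value: int) -> bool:
    """Check one cron field ('*', '*/n', 'a-b', 'n', or a comma list)."""
    for part in field.split(","):
        if part == "*":
            return True
        if part.startswith("*/") and value % int(part[2:]) == 0:
            return True
        if "-" in part and not part.startswith("*"):
            lo, hi = map(int, part.split("-"))
            if lo <= value <= hi:
                return True
        elif part.isdigit() and int(part) == value:
            return True
    return False

def cron_matches(expr: str, when: datetime) -> bool:
    """True if a five-field cron expression fires at the given minute."""
    minute, hour, dom, month, dow = expr.split()
    return (field_matches(minute, when.minute)
            and field_matches(hour, when.hour)
            and field_matches(dom, when.day)
            and field_matches(month, when.month)
            # cron uses 0 = Sunday; Python's weekday() uses 0 = Monday
            and field_matches(dow, (when.weekday() + 1) % 7))

noon = datetime(2024, 1, 1, 12, 0)
print(cron_matches("0 12 * * *", noon))                           # True: daily at noon
print(cron_matches("0 */12 * * *", noon))                         # True: hours 0 and 12
print(cron_matches("0 */12 * * *", datetime(2024, 1, 1, 6, 0)))   # False: hour 6
```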
Configuration

| Parameter | Description |
|---|---|
| Parameters | Key-value pairs for configuring Spark properties (e.g., spark.executor.memory). |
| Arguments | Command-line arguments passed to the main method of your application. |
| Tags | Custom labels to categorize and filter your scheduled jobs. |
Resources

| Parameter | Description |
|---|---|
| Jars | Additional JAR files to be included in the classpath. |
| Files | Auxiliary files to be placed in the working directory of each executor. |
| PyFiles | Python dependencies (.zip, .egg, .py) for Python jobs. |
| Requirements | Additional Python packages to install. |
| Spark Packages | Maven coordinates of Spark packages to download. |
Memory

| Parameter | Description |
|---|---|
| Executors | The number of executor instances to allocate. |
| Driver Cores | The number of CPU cores assigned to the driver. |
| Executor Cores | The number of CPU cores assigned to each executor. |
| Driver Memory | The amount of RAM allocated to the driver. |
| Executor Memory | The amount of RAM allocated to each executor. |
| Dynamic Allocation | Enables automatic scaling of executors based on workload. |
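These fields correspond to standard Spark configuration properties, so the same sizing can be expressed when submitting a job outside the UI. A sketch of that mapping with illustrative values (the property names are standard Spark; the values are only examples):

```python
# Memory-tab field -> standard Spark property (illustrative values)
spark_conf = {
    "spark.executor.instances": "2",            # Executors
    "spark.driver.cores": "1",                  # Driver Cores
    "spark.executor.cores": "2",                # Executor Cores
    "spark.driver.memory": "1g",                # Driver Memory
    "spark.executor.memory": "2g",              # Executor Memory
    "spark.dynamicAllocation.enabled": "true",  # Dynamic Allocation
}

# The same settings rendered as spark-submit flags:
flags = " ".join(f"--conf {k}={v}" for k, v in spark_conf.items())
print(flags)
```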
Frequently Asked Questions
Can I schedule PySpark jobs using Ilum?

Yes, Ilum fully supports scheduling for both Scala/Java (JAR) and Python (PySpark) jobs. Simply select "Python" as the language in the General tab and provide your script.

How does the retry mechanism work?

If a scheduled job fails, Ilum can automatically attempt to restart it based on the "Max Retries" configuration. This ensures transient issues don't break your pipelines.

What CRON formats are supported?

Ilum supports standard Unix-style CRON expressions (e.g., 0 12 * * *) as well as predefined macros like @daily, @hourly, etc.
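The macros are shorthand for fixed five-field expressions. Their conventional Unix-cron expansions are shown in the sketch below; this is the common convention, so confirm which macros your Ilum version accepts:

```python
# Conventional expansions of common cron macros
CRON_MACROS = {
    "@hourly":  "0 * * * *",   # top of every hour
    "@daily":   "0 0 * * *",   # midnight every day
    "@weekly":  "0 0 * * 0",   # midnight on Sunday
    "@monthly": "0 0 1 * *",   # midnight on the 1st of the month
    "@yearly":  "0 0 1 1 *",   # midnight on January 1st
}

def expand_macro(expr: str) -> str:
    """Return the five-field form of a macro, or the expression unchanged."""
    return CRON_MACROS.get(expr, expr)

print(expand_macro("@daily"))        # 0 0 * * *
print(expand_macro("0 */12 * * *"))  # unchanged: already a full expression
```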